var/home/core/zuul-output/0000755000175000017500000000000015146772660014543 5ustar corecorevar/home/core/zuul-output/logs/0000755000175000017500000000000015147020442015470 5ustar corecorevar/home/core/zuul-output/logs/kubelet.log.gz0000644000175000017500000353154715147020267020274 0ustar corecore ikubelet.log_o[;r)Br'o-n(!9t%Cs7}g/غIs,r.k9Gf􅴟lEڤ펯_ˎ6Ϸ7+%f?ᕷox[o8W5!Kޒ/h3_.gSeq5v(×_~^ǿq]n>߮}+ԏbś E^"Y^-Vۋz7wH׋0g"ŒGǯguz|ny;#)a "b BLc?^^4[ftlR%KF^j 8DΆgS^Kz۞_W#|`zIlp_@oEy5 fs&2x*g+W4m ɭiE߳Kfn!#Šgv cXk?`;'`&R7߿YKS'owHF6":=3Ȑ 3xҝd){Ts}cZ%BdARO#-o"D"ޮrFg4" 0ʡPBU[fi;dYu' IAgfPFόRB0R딏]dP>Li.`|!>ڌj+ACl21E^#QDuxGvZ4c$)9ӋrYWoxCNQWs]8M%3KpNGIrND}2SRCK.(^$0^@hH9%!40Jm>*KdW9lpx˯f<A: :wƗhhj+Kh`! x0 3WO-W"wC1qE.HK`}o9O\4\rJrLN|oۭzEK?k>i]>yE*,?k 9Z29}}(4ҲIFyG -^W6yY<*uvf d |TRZ;j?| |!I糓 sw`{s0Aȶ9W E%*mG:tëoG(;h0!}qfJz硂Ϧ4Ck9]٣Z%T%x~5r.N`$g`Խ!:*Wni|QXj0NbYe獸]fNdƭwq <ć;_ʧNs9[(=!@Q,}s=LN YlYd'Z;o.K'[-הp|A*Z*}QJ0SqAYE0i5P-$̿<_d^"]}Z|-5rC wjof'(%*݅^J">CMMQQ؏*ΧL ߁NPi?$;g&立q^-:}KA8Nnn6C;XHK:lL4Aْ .vqHP"P.dTrcD Yjz_aL_8};\N<:R€ N0RQ⚮FkeZ< )VCRQrC|}nw_~ܥ0~fgKAw^};fs)1K MޠPBUB1J{Ⱦ79`®3uO0T-Oy+tǭQI%Q$SiJ. 
9F[L1c!zG|k{kEu+Q & "> 3J?5OͩLH.:;ߡ֖QʡCOx]*9W C;6)^W;g ;ruw~J03T0|9ē7$3z^.I< )9qf e%dhy:O40n'c}~d i:Y`cФIX0$AtĘ5dw9}ŒEanvVZ?c}!wO,ƍͩ?9} [oF2(Y}Q7^{E}xA|AŜt;y}=W<*e'&Ж0(ݕ`{az^su/x)W>OK(BSsǽҰ%>kh5nIYk'LVc(a<1mCޢmp.֣?5t罦X[nMcow&|||x:k/.EoV%#?%W۱`3fs䓯ҴgqmubIfp$HhtLzܝ6rq/nLN?2Ǒ|;C@,UѩJ:|n^/GSZ;m#Nvd?PqTcLQMhg:F[bTm!V`AqPaPheUJ& z?NwpGj{VjQS,؃I'[y~EQ(S +mpN, Mq 70eP/d bP6k:Rǜ%V1Ȁ Z(Q:IZaP,MI6o ޞ22ݡjR:g?m@ڤB^dh NS߿c9e#C _-XѪ;Ʃ2tStΆ,~Lp`-;uIBqBVlU_~F_+ERz#{)@o\!@q['&&$"THl#d0 %L+`8zOҚƞ`wF~;~pkѽ)'cL@i]<ք6ym®Yi&s`dyMX](^!#h k:U7Uv7쿻чd)wB5v-)s蓍\>S[l52, 5 CۈP$0Zg=+DJ%D  *NpJ֊iTv)vtT̅Rhɇ ќuގ¢6}#LpFD58LQ LvqZDOF_[2aޙ-did˥]5]5᪩QJlyIPEQZȰ<'Xa>EE衢^}p/:F?}bi0>Oh%\x(bdF"F 'u Qx`j#(g6zƯRo(lџŤnE7^k(|(4s\9#.\r= (mO(f=rWmd'rDZ~;o\mkmB`s ~7!GdјCyEߖs|n|zu0VhI|/{}BC6q>HĜ]Xgy G[Ŷ.|37xo=N4wjDH>:&EOΆ<䧊1v@b&툒f!yO){~%gq~.LK78F#E01g.u7^Ew_lv۠M0}qk:Lx%` urJp)>I(>z`{|puB"8#YkrZ .`h(eek[?̱ՒOOc&!dVzMEHH*V"MC Qؽ1Omsz/v0vȌJBIG,CNˆ-L{L #cNqgVR2r뭲⭊ڰ08uirP qNUӛ<|߈$m뫷dùB Z^-_dsz=F8jH˽&DUh+9k̈́W^̤F˖.kL5̻wS"!5<@&] WE\wMc%={_bD&k 5:lb69OBCC*Fn) u{Hk|v;tCl2m s]-$zQpɡr~]Si!ڣZmʢ鉗phw j8\c4>0` R?da,ȍ/ءfQ 2ؐfc}l 2窾ۉ1k;A@z>T+DE 6Хm<쉶K`'#NC5CL]5ݶI5XK.N)Q!>zt?zpPC ¶.vBTcm"Bsp rjﺧK]0/k<'dzM2dk–flE]_vE P / څZg`9r| 5W;`.4&XkĴp 6l0Cз5O[{B-bC\/`m(9A< f`mPіpNЦXn6g5m 7aTcTA,} q:|CBp_uFȆx6ڮܷnZ8dsMS^HэUlq 8\C[n膗:68DkM\7"Ǻzfbx]ۮC=1ÓOv$sY6eX%]Y{⦁# &SlM'iMJ았 t% ~@1c@K?k^rEXws zz.8`hiPܮbC7~n b?`CtjT6l>X+,Qb5ȳp`FMeXÅ0+!86{V5y8 M`_Uw ȗkU]a[.D}"\I5/1o٩|U戻,6t錳"EFk:ZM/!ݛ@pRu Iヵvyne 0=HH3n@.>C@{GP 9::3(6e™nvOσ =?6ͪ)Bppًu_w/m/0}T>CUX\!xl=ZVM\aٟ6h㗶E۶{O#X26.Fٱq1M k'JE%"2.*""]8yܑ4> >X1 smD) ̙TީXfnOFg㧤[Lo)[fLPBRB+x7{{? 
ףro_nն-2n6 Ym^]IL'M+;U t>x]U5g B(, qA9r;$IN&CM(F+ hGI~Q<웰[, qnriY]3_P${,<\V}7T g6Zapto}PhS/b&X0$Ba{a`W%ATevoYFF"4En.O8ϵq\FOXƀf qbTLhlw?8p@{]oOtsϑ`94t1!F PI;i`ޮMLX7sTGP7^s08p15w q o(uLYQB_dWoc0a#K1P,8]P)\wEZ(VҠQBT^e^0F;)CtT+{`Bh"% !.bBQPnT4ƈRa[F=3}+BVE~8R{3,>0|:,5j358W]>!Q1"6oT[ҟ^T;725Xa+wqlR)<#!9!籈K*:!@NI^S"H=ofLx _lp ꖚӜ3C 4dM @x>ۙZh _uoֺip&1ڙʪ4\RF_04H8@>fXmpLJ5jRS}_D U4x[c) ,`̔Dvckk5Ťã0le۞]o~oW(91ݧ$uxp/Cq6Un9%ZxðvGL qG $ X:w06 E=oWlzN7st˪C:?*|kިfc]| &ب^[%F%LI<0(씖;4A\`TQ.b0NH;ݹ/n -3!: _Jq#Bh^4p|-G7|ڸ=Bx)kre_f |Nm8p5H!jR@Aiߒ߈ۥLFTk"5l9O'ϓl5x|_®&&n]#r̥jOڧK)lsXg\{Md-% >~Ӈ/( [ycy`ðSmn_O;3=Av3LA׊onxlM?~n Θ5 ӂxzPMcVQ@ӤomY42nrQ\'"P؝J7g+#!k{paqTԫ?o?VU}aK q;T0zqaj0"2p؋9~bޏt>$AZLk;3qUlWU Ry==ck vz(vb$^Nyo$p[DtUCE9sBz%lOONRѦmDVmxюݏX}K6"Qi32\-V_kR(I-wtSJR^m{d a|y,F9$^@mdH֙toN1 < ҷBq/ ۓ,j|z6OSu;BKŨʐPqO K\{jDiy@}b|Z79ߜih(+PKO;!o\戔-QB EM;oH$$]?4~YrXY%Ο@oHwlXiW\ΡbN}l4VX|"0]! YcVi)@kF;'ta%*xU㔸,A|@WJfVP6`ڼ3qY.[U BTR0u$$hG$0NpF]\ݗe$?# #:001w<{{B\rhGg JGIެE.:zYrY{*2lVǻXEB6;5NE#eb3aīNLd&@yz\?))H;h\ߍ5S&(w9Z,K44|<#EkqTkOtW]﮶f=.*LD6%#-tңx%>MZ'0-bB$ !)6@I<#`L8턻r\Kuz*]}%b<$$^LJ<\HGbIqܢcZW {jfѐ6 QڣPt[:GfCN ILhbB.*IH7xʹǙMVA*J'W)@9 Ѷ6jىY* 85{pMX+]o$h{KrҎl 5sÁbNW\: "HK<bdYL_Dd)VpA@A i"j<鮗 qwc&dXV0e[g#B4x╙✑3'-i{SEȢbK6}{Ⱥi!ma0o xI0&" 9cT)0ߢ5ڦ==!LgdJΆmΉO]T"DĊKٙ@qP,i Nl:6'5R.j,&tK*iOFsk6[E__0pw=͠qj@o5iX0v\fk= ;H J/,t%Rwó^;n1z"8 P޿[V!ye]VZRԾ|“qNpѓVZD2"VN-m2do9 'H*IM}J ZaG%qn*WE^k1v3ڣjm7>ƽl' ,Τ9)%@ wl42iG.y3bBA{pR A ?IEY ?|-nz#}~f ‰dŷ=ɀ,m7VyIwGHέ 2tޞߛM{FL\#a s.3\}*=#uL#]  GE|FKi3&,ۓxmF͉lG$mN$!;ߑl5O$}D~5| 01 S?tq6cl]M[I5'ոfiҞ:Z YՑ"jyKWk^dd@U_a4/vvV qHMI{+']1m]<$*YP7g# s!8!ߐ>'4k7/KwΦθW'?~>x0_>9Hhs%y{#iUI[Gzďx7OnuKRv'm;/~n-KI`5-'YݦD-!+Y򼤙&m^YAKC˴vҢ]+X`iDfn#y 9D*A$$"^)dVQ.(rO6ӟZw_Ȣaޒu'- ^_,G;U\cAAz7EtlLuoXuA}bT2H_*kIG?S(קjhg 5EF5uKkBYx-qCfqsn[?_r=V:х@mfVg,w}QJUtesYyt7Yr+"*DtO/o۷~|hw^5wE of7cꃱ.)7.u/}tPTGc 5tW> l/`I~>|灹mQ$>N |gZ ͜IH[RNOMTq~g d0/0Љ!yB.hH׽;}VLGp3I#8'xal&Ȑc$ d7?K6xAH1H#:f _tŒ^ hgiNas*@K{7tH*t쬆Ny497ͩ KOAH$77f|lgn I;.K*!<+"eK5c&`X:#;@B@[(K44sBFu M.MNWLlY]K᜴=/ VމYlϿ4i36$>m|_>9|dUA"{!$jKx 
E$K3hN(tÊ-#v#O N, 9g80Ǭ&VdӞ5W1!1KYd`,-*&>F~⯰&jb.~cNk BL_OG]Bv.A|'qT(Ol.' 4IE|@Iі)<-p JkQm1 `qacܗVc?)cl*&<}P媠E{-sVU>߇GUt\+n3X]Byoz)li$2cPs6D>TE-n# rve{椱I |p)U݋7yJw&PzDgi xs  xh\L r Ѥo Zt(I >|$>tnMdэoV#ہll/ؽnA(ȱbAj>C9O n6HNe">0]8@*0)QsUN8t^N+mXU q2EDö0^R) hCt{d}ܜFnԴ.2w⠪R/r| w,?VMqܙ7;qpUۚ5Tnj ۝jlN$q:w$U>tL)NC*<` `)ĉJآS2 z]gQ)Bی:D`W&jDk\7XD&?Y\9ȢG:${1`+i n8=%Ml%İȖb7AޗuV3A7ำqE*\qb'YpuHƩҬV nm=Ɂ-2=|5ʹ zi ' ׹U>8bK0%V\ t!Lku`+]c0h&)IVC)p| QUA:]XL/2La[Xѓ F;/-rtx-rei0hE˝ݸDt#{I} `v;jUvK S x1Q2XU&6k&lE"} Q\E)+u>.,SzbQ!g:l0r5aI`"Ǒm O\B!,ZDbjKM%q%Em(>Hm 2z=Eh^&hBk X%t>g:Y #)#vǷOV't d1 =_SEp+%L1OUaY쎹aZNnDZ6fV{r&ȑ|X!|i*FJT+gj׾,$'qg%HWc\4@'@—>9V*E :lw)e6;KK{s`>3X: P/%d1ؑHͦ4;W\hx锎vgqcU!}xF^jc5?7Ua,X nʬ^Cv'A$ƝKA`d;_/EZ~'*"ȜH*Duƽ˳bKg^raͭ̍*tPu*9bJ_ ;3It+v;3O'CX}k:U{⧘pvzz0V Y3'Dco\:^dnJF7a)AH v_§gbȩ<+S%EasUNfB7™:%GY \LXg3۾4\.?}f kj· dM[CaVۿ$XD'QǛU>UݸoRR?x^TE.1߬VwխmLaF݄",Uy%ífz,/o/Z^]ݖF\\UR7򱺹...m/~q[ /7n!7xB[)9nI [GۿsH\ow!>66}եl?|i [%۾s& Z&el-ɬeb.E)բA l1O,dE>-KjLOgeΏe|Bf".ax)֒t0E)J\8ʁ,Gulʂ+lh)6tqd!eó5d ¢ku|M"kP-&ђ5h ^pN0[|B>+q"/[ڲ&6!%<@fpѻKQ31pxFP>TU?!$VQ`Rc1wM "U8V15> =҆#xɮ}U`۸ہt=|X!~Pu(UeS@%Nb:.SZ1d!~\<}LY aBRJ@ѥuȑz.# 3tl7 ]وb Xnݔ[TN1|ttc‡-5=VrPhE0Ǐ}Wd|\aD;(;Ha.]1-{s1`HbKV$n}Z+sz'ʀ*E%N3o2c06JZW?V g>ed\)g.C]pj|4逜*@ nBID f"!!*7kS4޷V+8弔*A19`RI/Hй qPq3TY'퀜+/Ĥ'cp2\1: 0mtH,.7>\hSؗ΀ѩ آSNEYdEcaLF&"FhQ|![gIK v~,Jc%+8[dI368fp*CDrc3k.2WM:UbX[cO;R`RA]d+w!e rr솜[/V`+@;Τ`5d0ϕ_Lع`C"cK>JG.}Ε00e>& 2䯫vNj31c$ i '2Sn-51Y}rE~b>|Ď6Oj~ebIapul9| 3QtUqSCxTD7U9/nq.JYCtuc nrCtVDƖϧ;INOKx%'t+sFUJq:ǫf!NRT1D(3.8Q;І?O+JL0SU%jfˬ1lމZ|VA/.ȍȱh M-r ~[0AG꠭y*8D*-Rz_z{/S[*"꫒?`a;N6uilLn<Yllmb rY״͆jqTI!j.Pٱh s!:W_´KxA|Hk1nE6=W|$O -{]1Ak$ ѫQ6Plp;3F$RveL l5`:~@c>q,7}VE-Q8W70up˳ A¦g/OEU:غA>?=CۣPqȅlW11/$f*0@б 2Dݘrt +qrx!8 J&[V =͋A,z`S,J|L/vrʑ=}IhM4fG(Ȋ1{TT%41Oa'$ 4cjF_q~lITQu d Vݭ5F\d G!:鞞VwfD5T|Ɇ^;C)ZY UIb>ܓ[懪g7 ㇪ H` QKaeQrV M96PE,+aE&nk=`eTZ7wYr< <$*LM|`ՀOK8$ a 9żhc%Ah8QlGMq^H~wʧsZLaKoT%#]QH{&|2ۦ^\ޅ^wLL+p^{/<*㸬-( ~":*x}Ӯwl&@7t Dc=֛7#;iڿDH~}y eꊍ̈Y~VzvdrAwD4(kXr/fK.g<&^j {lO%E۱hԊe=*$VD^/w3vZ΂/(J6Jeό}b{4 ̱bˎbEN,qL˱0\:I44€ 
4)Qv](ǓXZpD*(Ie&bM"7 syUҢ㑖bibCa`ŏúCodz=@[F#Xf| '3QX>JXLъ%oFq9*hIXHDHt騚Q\PɂJKԛ/蜱Ѭ vrOwIE奜4k[2uU.{+Q*bsjN-gKGњ5)UUf$u%d03Z&#i5RMƣx)l. &PB:J ކ?Fꙥ_iZWZP~ŀŀ_P5=s:e -#l<:n~~JHj2)HQ<%?.WTȆǦq|>r0e}=͒dT)q+"Ft,`"K9v֪PGu;v-w)J/GB/:W'4o#;!Fo_(,t;8;>SYӂ/Q< O"ad{}:&Eo/1iDϺ>N(5`͛*7;i<&7Oipܠpۯϧ9n:FkF]}48y˓EEQ8=>JFst }80"Ρ"C~M>}TI|o:ũfXwf*ay|1h: U%cRT)T ]~S Ru(37Q~e'0!‹,ѺK+D/ s0P`7Og[x7_wN4V죠2:$\i,VmB1>8p|1p4Ԅhe. 8A~y>7ElJQ9-LT8(oG +!]> FMWل.9Yxb#iNe*Dݍ {HR&*dAhk@q0_`Hjc8Z܄VU!鴤BIBܐP㰬<,lAkup\KK{e똎iR.\M)rszSzsJ8}wx4AӔ?9$.#_]J@C|?l ɂư"z I9SV3xfES>iz԰8WU yx b.&&#݌Rm˛=s)8KKL6`.5{{ռA #a/`# 61Os) nWR?m<$~FdO6uSQ!n~8Zt )GL==-_YK$)9kݐn2uɌ{1MWZ voqun.D^9| r]+:lJ| yrJ# kRu9 !ـd?C)N aZ9Pƪ8HKfmL3f쒱Byv\Ђf`͕K(/hwڜ˸VWLQɓ)CRدYS<ólų͈e;.{:cYFlP`v貧/V{9aaj:y*ox$co l96?Όjv&2p:GX\g]IلmIUcCcbjE/VDmтZ\`OMkaȓZ,PJDuåuW-]ACmٔ aTb7lW[\L ti^y䦏*ӂƼ1YZ =!OGhj,/Z}+[#VdӬu_HSX4C2 )K0*ESu?ϖ!yd抰]FSskeEr7+ۙD\ +"EpF]$y6ts2Yu!XbEtYu*_X(5dV%?HXrU&JލeB"ܔm^(4Θ؞NlaS\.X"hcm4u]\%rp?[^0lol!W[WmzKU/JmQoڜjD 엳X e=nd4+l]:ݺHh]Klv9B  0 5;E;VS..mybչc-_K_e'0ue%ƫLuC`HtԒűKǢfU; Ee}$S7x˜m.tqLYhoyG#l"j3 ϷxHT+PEEZX~uU*G<=J%nBT] CnM33ѻLW#oNOQb~q]ͽa`Cz6V؜5OMk^ mj܆*S=,;\zC"yឞ%[}et*g\^KQWDv"ceN\e,T*-$~~&ο]i-uu-B%k|o>mDͽaUv7à;D<ّ.%ܕV#:`6ܑVsK4$O6wӶHi$kSP-ʘ![}99FCLjŵ-|s(SmX;0`8>ڈ`F ] XqB ^n<M}K=щ&!L2``:`ʾ ƾ3 2Gb{4ywts:qvd/@w`}sGFg72L=ХT@ N(dF`*17m鲟GlNVf(tvf6 vfX LlegWے p~aN?:TL;.)kdAA:;iL(@O5y\M mt;̶Gs8$$ؑ$pu>9v;;0P/>s3ԖCлrn\cWκ~`=@x&,ek $fƓqI`cQex||ۈ)$@\ o/ ?sZ=[e7wۻޱm@GUs] o㶖+[$ER H3yd]3dINtǶLv琒M;:ҨwDIez񪹢BMƯcSU7~umk,BQ0JnR%ȓ*NiOP^:*.g1Sy2sG@"я&B P<:!Փ:…H>QH(9 A@ t0y !EO\W"ޫDQGP1)(Pء>EC|q\D<}p) !XPC]-=IG(O(Ko*Afp0@A?1(n ApOX ԙH"K!YJ!i!k:!V?ˁ<t !q21}0NASQ-E?i.bWOW;n-E,~"NuN!UIb zu bze)5*!Y2Atf (rlj%$W$?Mx&I=YYor$YWҗ)b0QsK-I:$v.]Nz,?TIƾUTɥS: U(Ɵ!X/+Dqg>Ń*3 ;|v,R\ P ڳ<_%zDЇZ%i_j=IiU h2RӶSǵ[ǕCEʥԢ*Ȁ~?Tyrr9pv[%8$|uSX؇%Q_p9ۃLJԯbRotW8"i+Ӹ@8R 8hwJ [p؂* T$/5]Vr=9m!9]gzBBСrܐ>AR:_y^L߷QC#up4-M4)jRΟ5NQҎa9\2UUѬ `gcHj횀D@pgd^Z09N+}XZ>@ 9&>OWdT FI5QshhRq/5lr~ٴQ?avVz5_ip±y53FfQUf(5+dQ-GBrJE7b2|\EQsʕGX#?=I6-yFzy4@<6Q$87] ǚfE4=:oAX~s&(Ny#3q5~$G{S 7@6cՀ@]7(_ 
j>B&,Zkb3Hyh2JW^%(|e\dAB -|*?P X:\`Z6{>n~kPiYL;_Cҁi^ S id 㙃1 n%tAKG%i$RO{c?Q^0*m7e"Kf=<_y"(5%AEŦ (S/1 !AQ?S.L~D0iH[/Pbh7gh^'UV1%b?9Ne-&5%bYjM7=ӑ]݀w].1.bj@N"S\̇;:hu] 0U=BM3MJ;n) U))}2ɖwo %x$SABz缰;FȎk*&9gwx#&5VWĵܝY!9>\3IZrtБàd3-G0?Izg+0y9ƇIw7+J,_Wynr[kQj#O|3g0]G%[i7 H%0,$YM7Caǹff -m7z۳C em{A<_Mat83/_糼J^[o 7ӆ /}Ά[f}^Nq_ t%6k) zp'03?'ss2]i J9Gp\e$bQ M1?iR =̝C`5=@@/2h6|~6Uoxh z&/R9ڽNSZgG S SؽZx.m%h/tu˴@zk `Kӛfa6<*EVT lFj-.Lp>k;2,:swsfp;Yt0{kYoΨ(O'0֑wl>mLђl9OKy!6{&%dy~p=Ԡ uݰ7+u;[m;&Tp: .8_&l@L*GKkGb^ Leי'X= gNu_8ɸ|$kf:Hw"OiY 0KP4J$<:Y` nkϭk}$,L`͈FXs_ɯ7X_% 96S0e(bj+N"]w'z".d& Ѳf |&vffpg\n63,F7;WZ2 }6VIY!g2K $r+>h%mJ~م*L,2y3i2]TݳrT2DzeriL&&}_lOL^KW-&j`O܀9!?MQI:~W5uGN GcdStO^6.;1LM Xz52-2ޞt?_Y%5i͟ĉ7^8>=W?"`/B[P:붾)PP}3 @$ă(jMdX 6VǨUsK!ۃ[] ]^q˲n0Ա< YgHxU(E":/륧5oHvI3vc}L-xxӺpc '+/Ia}*ygT P<Ypd..Yy3$ L;A:DԄwؾKj Tի&m׫=}8LW H`UE}(%UUCoũVB,I[+.-<砝n֡j z)JaaFOc]gXfM#ˍjÄNL?⠣{)݀kǮCdҕw0X͋uKy&{xX +"&M!@_~8T@]zN%lFir@o`C1?UTyvj",idbXH܉%L6m|˵w3kzM@ sde.j5wq B^IVzzԆ09kv ] =[eMeЗ?a˧R`5;Ztw \݆ɘCSɨTaAtn"Y>^㴨hq#P@!B=XGJmq16#r!VN\l^ V^xd5:!Rބy6_{`'`&="J7a-pZ$79JosZk Zidl]Z(ZQAykF#ӏc֯Zy:7'[[]X ݙ6e&XbUD_/IYjItx~"(ly9fGCcCox>Il7%F.@ihoܘ<=H,W;S`E2=I69Ƶ.˪Lk,2#d9<_=]f4Gz G>ӉaatҌ6TyWVdePʩc `0 կrX()KE-vT#ݗJv=ʌ=N>O [?9|ޏ??~sNQ?j\ʇ@޼~}?'7?eH{ͯq <:7ݩn7ފ7Ie~\u=xW7} ~)(bpƧ4(ʙ!cn7qy{.Wc7 X7vosejTK ;Uqp-:=q_ ŚI5&9۰/1g%HͧGz+uM uz\x݁ˉkד#vbl[\[pzt;ՎT~ݱ݅=U\-OiS<2tem-!4IHf}O&v츻$p ^Y/@?}fO- pJN()Km,*-gaİg8גXb[NJW1f(Q M3,x혎2/aJXg/BcaF۱ЁFw:\PZKe&j(Z5 F,i|K#eQLKX*&)Rǘc%n(·TLh`X`A㷶1XS$Nr#Tܧkd $5}(yc1bKMwd/(;اo d-Y ﴵE^`B:I5 z,>ez0b LZl0#\bi"[ì>A abd*d:3!mv,o v,*!S* LhAzL!!h\iZG]H:f\6$8rU%Yi#5KF'8]5& |[ T~Hդ3&w29010گ:oݓ?b${SAŹI78ŞVpjba}I8|1$7״A V.s L I#S2q#nO, xrZ0LSS.[_o|^lI'3 hB@`BXKz\H*e=6\!@2}t=Əill{eW"Z-KEh>W[]A,W?.S\/N_z WIs`gJy5Fɴ[8ʄ"y38#i@lOqS4C76[&8Aet5@`Ȃ#ȱh \Vk%kI ٲ : ldF#F 'Xj+Kt넨#S)h eH:e_?S*ynri*nuz> DJΪ5Z0775^Ip4|(RcaJcֹGK !ҵp[1b~CmBRWawieT[Ew{Hpsc;\Wʃo<5 >>(W3aP״Tc{!)kj N.`8UW)E*GXF Tݳ!w{P]@$9O Ns1.g2JjP,$o9̓F0b#/߲'3 r,*&sL)(Oz>)iY1=:1l}GC[<|t2 bpT)y^W/ 29tG1Gip54K+{nBή`$vo#i!5;Pc kp+>UYxH:[Y^$88VW!p %}ᘱ 
rFX>tQ13'NvׇpL{Vlզ .z33 s!pEkZL! q4[~^-R"F*ըXR22 qj0.mI_GfU:AWx)9rpBqp&]"U|}ଗgX /{[aV* қAHUBc\i^e p'#i3&qp76=Og gSXt n h 6 e8~[$dMAEN05/|PUĂ,`$z}yr$8yEm̎Fϙ(Q\$BG +͏ ,$e 4ܬf$s|sϋ϶!҉jt:BH}wLۯS_uDK;"N&W  rAͬ8&FCKTWϬ\lgV LZWLk#$>.>nob6 gM/]P3\B3X-:FxlKFn_Ip0{i,=1,XzO6HOG1MfJ1-bІk. ďeAqQ7羸9[1+zԉ3GFY1b9 [ w_^=Y3 ^86$"uD2׬RjH`z4Y Ƥ5${GR 0YIr`K1& rgs#qֆ(ڱ=3!r!}Sr#:#^16@m]Aaco(zm]z'>L] JZr hcBdr#Lޖ{vWpf5)5Ib&˭iJ̪: 2MmfMBh|Q4pvR '.-NBYWF88>kIn9kсi$s&NaH0bH;)yEL+p`NaUeyv[d,F95Aď JfWo>VtG5"QNkRvUd~\.2QՙR^m0]ZOjK2tm$JN`&X,]]iŀ#gHtf7= KOzh0rdʁҺ}/y\o Ŷb$@ƒ!-Y$r 7rV6{BqٴQoͬM {~0 4tR&L޻6q S}"ΠPZ&̮8dV׼CFc0lqCwM?3 1.a \Z)\YPBk,έTd#<^Hp C8QvȳR7Q a1(EL + 9d#?;.?pČS&bp`2x ebЈ!i@3|Vp: #]eJ3y-D7ēüW˭#qp# 'Q qhp/¬fƄ"JƦ0 7553vXR-3QB +Ĉa3'U,D ,vhCgQb7x)$r5=k%A*AWd #|('Qkg U61b:\b%6ħu= 49+98ܠ?brY#~u7}u4aJAΌ J)>vA ^4V8n:o͋DxؔгnƒmۚG9/ŰhI2ʨknDQVϪrBوif0+GoYwhiq LJ#dx* UOԧ<#j2ʛǏOuCuIy0;}ndde\7m}׾g~Z\>,!q`,jb 8[ji!$:fKC-#n0GYmF*3Ђ’=sU 3^j{|' umyLq) CD}:B1z*s^^5]M*8rOfIpDp'콻1i^X&App R|1q-ESk3Q` O,$o9̓Fva$SFo@&óy3 F_Ec@7F y[dz{J\-wtVYL; YR؃KiV.5cK}룣`ZѾ5J3zۥ߇G&UHSj`ËoeJ~lL[YK22ph)9I؛y 8sFdTsa|AXlI%>Q0DQWK&ZXj./Pn,I!p1+_p,Q Z b0]ˍ%^'1/}a[>c4F ,Od 38OYMrwAf7p\s! 
P]]cgn,{oyç3 Ikzp}TpY͈ۆT1u`lpdbHU22SIփ ea0J㖍Q#ǭ7pfnBaO7]"XW~3xU8ӑGO|OO^գ羦O>FuZOŤ0?Q}'QNh ?۠1Gǭ V"xe]Q$jQ$*gNL反UVU($PY~}=?L@,j.Rgǿ'st](L@i<ʎ`㣨 ڳ/ DEs%6Q9'3I &UjLh૧m }D/s.IJ~4{܍𿯁 P4sJj(RupȂyy˕na/[A_}?$?h\{,:g=K>X2*}"^ag砖s,BèK,Nw9TU^>poj\Jzs^_2.+q*rDnX"7[X`}(Е bG݅!"1gwEYطgێʹX, fAֺg=`d~E1oC.YWRd΄L;fC39?n]QȏfwT1>ZZ8B2qUq?p~\;+ \]t8ٷ3}w%!G*q*3,Zq,kWte/U"O_)s .x~LE,>tm;.8jH@T#ūW񀾢[ j y&͊5KHy8ʍڣFa */ǢrURG~zC:~:GG_g7s HX?pl;ro͜mE- B[]N$JaF62pk14i;{K[fMrcPΒuu>Gu0n2û"65GUM% 3TAh !kPD;6~nZ)# TS$4MJ=%MZ]Ys7z`}T{{5BVФIA#]h߃ݙ2risHQW&+tI@o4 r {`0x~ÖZ_ԯA$:־$q7RG'8RtrѸ^9k*0CWbQ"9X$XH\b-ًIjIIYoYC ^k*Y&\hIJ)hN ˌ -8$A 3ՁE]7@vW4GE|y:^$l=er@G)3Y}K4-a,o<:"R'rmq  :z8o侽o}J=k960n^vyBjk+9Z)Vpj³ufO#%;c[_~r"V{uYJMVHI FYuzL]kTl*}s6g9(p猔(B{2gM.p`EcZ]ɇ+lYT.2@a[Ӎ1,b~h_@(c~cFˏjHPI1*۱c rqHϛ_"- YrI )@q8\xlzi}M1G&qׁ:}!Px$zG,2EL}Z;;gK8{oK_%M tUOor"m,@׎"$#4mZAO~lz2{_z d ?Sb?a#/ps~Gp?\ T|7^Hjs&vN< 6Zf7>c΁G!\'I{00+c۲ @LBIJ_'ʏ'p?BʤXB5h Xlsٽ%R^BJ?R-\~ #T+د~[*VVhijgcϗCAY2 xZ{-0/ވuYˉG̋H*M5.R; 99%&Pgu2L VB.Y)1Bw((fOP8\7`MRYם]ڝ^ҕeeѮВ.LJvn $!( ^ 0G*7.L6g ALٍ%nG$n5$[MzY9f6-ƪß20)QsOr/CXP:cɭܢ =؀@zݎjzo/ / n;o}#߷?A:1ws;v'sv-wݩw>]3W'T/\}0 O/!3{Dg91a9Dp.C)`+D2y|4ћgj;_;?*;x$"5ҵPw{%2OmqFvom.9H%ځ`Mz$7搫H-{hRO6Ѿ)K& Ӧ\)Lۍ2papTAlz`w'/pV]+n|wjTF, 5(k;.\jg}c7E*@KbLrgҲsXF)K^Oq*aH+3-:9.Η$޶mwu_ }|oMc~Tݻ_c%NG`KOn qQO;߀1o@Bpv^U',KBW. EZRN*Z" q͆>P)j|9j.DZ*_LY%!J98݋5m30.+ufew8 ZIBdyEI ;Tu0mNJko yd* hB鏯SNw8p'.yk_Ǚ#Qy FqpRa$ S9:F0%.𘼌^=($mIcRo Al§~,ԂƒVo/|U Z@Wڠ~@7los4C#+$MJ\EusNgr MI\iE1Y7 dLC\a͹q!XP`Q%KAq(ε9 O"vdLbwpEuv6kx3osL ј7H0m)ё\cB4!RxoYGU*mD#š ^ (|EzdisIL )u&)#nT-@)qrz.99J0+c[eh͌!L.b E0aÌ"WKؗgf\(Ad1քO84f941 欚 n,ZG!lrK<9;k8zO-]{yShvd.8V;-z"VGqt ]}T*ـԨ悐&nZ5e֊kcָQ 6(#irG_aqnSF{EEm1ds?e׌[UgD*vllS .h܍GɑP6@6dRɃ۷-n߶"7pW |/\)ǽbshʜ>UCL1p QR5G}7G؀ mt ĩdHF49^*7V*Ǹp+o Z-͔$֊3p=8Mnk mn7H ǵT*ƱѲ&WYHTXt. 
+n9!N@a=%ۑqڊZnD&VcE$S Ƥ˱a%e0 Fi^ə+j &h6|˰ Ah.pDD܂\p[2֠MX@T5M1dAK\ 4Zl T>EF*P>p`5מBDo(;tK0klsuM5\ ErWq̐zKl0 9LA*Csw-mh =gClzћ6S,ch#˺3lI V$朙9v"#M9ڡWӴOӌI kN3 8.?s9T=jK|B3p̲9oqL]dg\@ 0WMP{$e?Mym:3yjQi-3E2|7Xg%2o"E(@Mz$T3"@3qKYףMՌvTq}M L:r":BB*T9_|N+@ˡDjٗ NLai#!\)8GRg -fJ N,F5hAIf;V8sST1JL`ٍhM~&OZp6n (1h1HZbT4"p3$+V)aӖ%JLSnXJKrPE(ѥB`8W,ln d'!;S.tR("=d|?I7xX֕~pe@Gϸ" H"8?*BQxPLݝ{+8ffUhvѵ\=Fcjqz-Gk(u]i8]`vp+$Bxo$Ky_$7To7JP1(]ݎM)ZgF*ҟZ ZKW[HY.!+f%wgUeXa]7,5׷#MHXؤMr eĸ MylIBpЮD) 6 H}<6D,XyYaHs@rR 2wD$RV=*VNf IT2إ̴Le:"D,(Q)/쎤?*]xNT6-3+Q̰pTvEA%@8fdGRi^LChaRexo !x¥a+yxBufѧDTsP bmݢMk9n㦼wսrJt<\jސVIHJW`j>x {ze;jVPPl>_[NWowBO(8ѹE'(z0 ~ˢ3.1j b$260&5.Z[՞%.v eJy{T +{?:k–@k8##% UHAz$[['ȹ׋*!lB /$oK3ws1" Q6ZC (XTqtF8 i7kDBmbV L ,~jb8Γi{[+/4{D: t~?"pڹnWء]'~RUnT]O'U<;9I Z_{R07Iw~pu`Q9Mrt \ Ik$UQ&o,NN lp}ٛ ݓ.WHKZt9ځSZɻjNJ}_̇ͼJb~}׼YC\N# ٲq_ [gAE]Z1|K]G7JFUCVAXE9BxVˤZtf'߿_|Ryokk[ZSn?}2+'A?_P6^&m_k__d ?Mf֘o\jQ"Jӓ B f\A;'WW1 <+D =<.8Ijɽᤴ96mpD,^iEr>r. Z9O xcv׊7u&u[-k82E7>.x$: > FE-6GE6͛W+|8\_#VճYvZ?8zrdF~Z@9AWT??GjpAlRvD"zIsr'I:;d-ݐoĐ{ţMM>S(zM ̫VGG~WWWS [AQzt<)3 5)B AS:Uvdk0/flV_xW7*^=FGO޽l=~JvX1j.yvξ%:/&2G/<8s[̓#3s^-x0@H>U׬)F3Irqz^Gi:C/y{{I#,sv_Жϫ&%th'SF4{]Yw4sm 4>/1Q節%#bq.{|ms@ {t]ߣ)m|{t~cB 0%cKxAKx!sr~`OQǬyTҳ=b~YRQ?wt_G]W\iA_g`8JG|ݸqCoﺽj1u6R5;։o4; t Bn!i[sܽ`+fm.?N'y͋3[v5쟹_\MtA^ 4ZaZ~-OY{x$ @EV'[g#w[7ȷe\9 }RG8?~paXS +!<,֚!8wxK#ѣm5ⅲ ?J%ާ|rmCj3͹h0{f\Tq\TRJ弤"ϥy| ge,ٌ \K/žno]+ sj˶3?@;,~Y?Ygz4$<^ݮA49JʗU;9}o/֨Ѳgogكe.uE򇉉@'\'DhJlӖ㦸z 4JIJ,{Ak*? 
wc4'!-la-dJ+Q syp RJqVk (չ)= ﱟ3?B;D7j a{d{r֞BvKmg [U]?.ޡXag%8Z]:MiHG8;˪t MwewfiYF@@"u|wM7ǣG=}z*l78)x:)8@"c!R Ia-Ph8Ia-%*AKCW!(!cm==є:TLE233L/ʰfZтJF߭Vd^Mۛ%6]tÄ[Wې S&:Xt2E"Gsl7m@6"Іݘ/oqI 9TJh[[#oV<'ku%dPi4Sۣ|yR;=lFeTemݍ:[S J VO/FU b8T4p{[nI# Ӵvbdl1Q0{Q`)N$=ur"&IY!>09 Bց?OzxpV7fCP+DB*Ԧ>X!ݵBD˝NSh junՅVo;LVg'vC^@!"nq۹/ Q x[f0)hT˖JAT3CP t)fl8!bR?=:2Ej~1 :5t='|v]^H uSr[XEyEQPг{j|(|zrc(f+V-ǭb"c2 ż`:h2a|{ |@^5R~c1ќ7QxޜvUFIU|9h]'e+޻*˳tq^[bJ:/_up=:1pMQP66 -˩+oN|_y{~Jg]<͢Q{f]>JCPtWE\-SeIc`ug!їQ~E|}JKU,Kj4U*o <4Omg_I5hYufȱ {* - AUؐ4J WM~&:&%\Ċ(cȠea.gkAN Aв@W%ЂmگK`^Gף㎁`ub.̘o2IaHՒٜ -Fvnf˪?U4օJ9b/7uk|kxpkyoa8Dwپf\ub_QDaZ>3y_6ěmp}ysrYh}͙f?zVQ.M7V{|8H_8Ę懂tcm*W&<ļ;C"y#H)) i!݊]_]%Ż;srz;EBܱbV7w+ί?4Q_) ރc1ﻱ"(s>*iJܵCnwBM׹pݎ9ITָ"sצf>w>wON!،9gz-tNNZDH5=\ę!7yPnϯ>#x|~{|\328qU _) I#?]_~] PA`Tt#P^N®Wq-J0,hy|qW'VO"@!ۿTfq}{hSHzH݆#j*?f:69oGP5e#Zu v8w a6[R[)euݷ;j% mE[I8bs9CJJmH@ָk@hT8qѤUzڸ<0)_޹[ןIH?Fc wd,K:ĥVAumk蒍HZCTYOEoö-Њ)CLhƍvF"%r}EtMk֌ |؁f=(2_Ѿ~U*[䡵]{_ կb3mJ4("uzV"mv۩dJnfƤ6>xS'%UwRӫWg9ΖI5w}MhO`?0wV\'SHy >ڂ>qv^K$+~QzVBzS.oI2}%v_uli> !=qC\Ŧ{hCksgv8sK>B-Hmyh8r[+J˚6E0oվEB;}hd: J2Kžwm6X=iF 6vhAgiX<]>~>sjGV| 27r8cƺGg'z ; Oi֓z}dU1J^כ|zݖAz\Ռi1ISev>$~k=طڷz^rd-캆P!}g&#k[C]X>ѹʶ2yh' v.i.%8q#tE>,^#\ٙ)چZ.D'Y7j vw&<|'*ȑe ߸練]*MYf_U~jE`~p3'9b&-g2rdf~oY۪f[a'Y9&PIg`9rd&b$K(U\dvMu,͸ÍԼ?7яM0brfظm\71mBK<^lW\;WgXۧW'W~v`*FH\܀y9jcΚ,; (rwO}k(_2K͟1<;qSF<.N/PR*BYϚѵ 9zDi=8^rx,^.՟\dp/.-#wR?gyhմ8\$*>XvW~[TO/ZR,Bj_tZSrՒo xGmfPǃ +gS6 9}bX{_7CYOAIdvy!l/o߾]Mm'?S}{r]Uu&ϯꛥk`]iB__EeϮV7ػKwy>c;'/1ާ49u#[]"d=Fo躯g>_~/$zO{5%ϖc1?z\].Xh(ⵜ}uҠuWwә՛r*aUTڹByL*MŁ9LkpssxjTOF#@̷=;DߊvE֧sLi<{A˚g/艄%sm)yœ\'5y ͣVE[u$޲o^|%$jVKv0!d_Vߖ?^b)Z#c̶Ұ"%-{ &yR*-zm Ɖ6K.)>1i&PƇDL$ e1,i79dd#"Q['{3F1%ira d.w(onb"%&D r&Dkr9CCe6XrN! ǥ\`ə©$KSA ZL%H*r<} .TLA6E1ΙI+ Z*TgƊsXP-G:I+,f9$B>edٺLdt}h42K'E/6s2O M.xH`fCC- $ /xaXcֆ!h*E>bM/Ƙ RZ]WD0:Q9~'_o6*𑋨#-Qsc'0(MNyxX|.f` v XE%b%,ϊK&#I,,XG . Z-m,쑵 g=/4~,S3(KUYx8A,2z. v9Rk|@$|3.cHJg0c)ӠU&!&X0Y䠙`*RW$E-E筍Eĭ! 
:kvDouavu6Rh2D"D `gJZ@0X``\D8B{d#TLgPQ\`) ETjyVgV2p@ :^K z*R#Huѕp0;h*)0>1ippK#mp~YBb~e^p <RDAEb`=h, $7,x">f؜:@i0AiQn١[]b^\1+98_=LT.+rlsL l5xu!ڭlXp  muZXI>&<{aa3R9uTw%%'^b=TJWڜK'a [:@/r TN"Dޕ6rd_IЍ)Q/4`"m/1XD6s"3ER*Q"K.3'q8qӢ@e x&RHg::(= ŋq^; y搲87ny(G Hq3"BnQ'%A3YS Y?># y0,Y*$I0 ?M&cuY FB`;,H_K_nϰJ;8qR. +Kk!9XCCvIVnA VB-7 >eϳ̥{n H`"2d/%-QBz$jP $ڞGH#qRNJxD*d :zd+3J 1W"  z%h:)KӝD*d=L #JK8=B ȔBA^+Q6bXoWH>Gv XW"&ꬍ ɉ!Kjk!'0D?"VVdΗ7(ưɅOB*ň "k!G=됪,K`~x%!(GK5rNebȍV\OLDI|0$iC2iCB‰02^)Op;MKVq85(cfƲ|ʋ7Ѧ)p{:D  oKPh{ #Q̠[</C_Xy/D9%7¡G[k|T0KZ{2HG܄O",P|Lq&eH;sJ`4Fڬ_hK-1`&艆&%!"T$> %?tPbp-d r^R; ( 2C~W(}ygͺ~6[v۽a^nei. o-r@8xlQT>!Yuo& '?K,f~Xg}OַbZNE]."'Oo_kew)'{=!➱B%w?QIrLPZUIJZ[XJCӁj@b!ہ P*ejU:1QS{,ߤnXgJScΙ[5rDˎ;WwK5#T ,Ӝ$,ܒHVCԒk=H8Fجeɩ| bIr؆ jǜm?D,XrJV#p[rjw"YfC[vd(1lc+ 2B١ج6R3(fOcVK-?`lBn+lJF##X3&,+Ӵ}!֝lzfqw}۷g+Zӛ<4~;Bb `[Z3R+,S \4UNrXc&T^>kV?P0lBX `,ݚ$w֎ $@Rkˬj(`!sSP* .͡ج!@Z­f XQ)kXc-z[FtII XJaT^>bkcZ{6OXmi"m414o\<ߠ'lI}9i!;/<5bhKec̘25nZ~g]s? nlrNj'7?/ϑ,^_酤rBt oh>D 7K@no>$z^?uh~_O6R yLyoO`WFdRIf $Ӌ+WKx{+o*W`\b_Owt|4lI:] 4rFg)#!/]"2dF#=F@*xɼ> Ql-ElU 5٧d1+L}6((<'rVNJ+Zcc C"YC1k 촺8 Sdr8y#1lBQIlLR%%9^>kY]](@ ; F3+9wbpl `K[=J+i;LJ3 Z=w;* D6,ez `Ȏew #@ &`JVbfiE.{` +o֖'kOjmi{(nk1a3x[m)Ƅ# 8n0,'v6^9L[ Br>K*@*XC!v'`5sE4=#oAh^_?1c B~;[F0xmrLlIUG'id*oKl%2Pڥ#c6dLp؇ܹxEgӛ~^9veG!fH&{ Bo,jÕKJˈQZ&߻9+-bS"k _ tG6aS8))9hk-2W!FS&Al[j# :%,b6-qa2 #Z)ϑJgj+Zpkא6[Y=JE'ZDed*G&%V&*\"$[9 0w$&3ԘS֌3ˠ,d}`9Gb$csVwoES(GA-ni=-CQ'3BTc nCfQ4;I`"0Ǩ! ZC1$w1gsY;Ƭd P?#&bjUB8bHN =-o\HL6.P$HsPTY[+;8~jOBon@ˬ T3 KYKIHB( eZ{:_!iI: `,Nv``L&/ ~FDl+NEֹ%ZHYnz2AlOSu-tѬEkA3 El3G4o| oYAӶdz|GAw!4%MY 3 dZ:p< r,*&9Pmi>j$\XU1<@xbgQ8Xr_f`!ZyPK()Cdʫ2f'ejEp `s~y{* ?/eڃ›Aq*qt/=ɂNU~T}%*y*jC@Vɒ. 
Hb b~z}wy\E$S,W zKBx2= RAjIv:"J,tҗ 7 %ݨA׀R"pc D#`JɽV J 9h v9 ܱ ,P^iFhQ"=@aRB^`(C*CxsTu`-й'GF$8QNQUY3fnf$Ac*Ei L>{*T'Q !W(-(!Ts'kuNu~kP|P|y )";i*އ@&(m@8XB^'X)*9lt8pԋ!+}!1&pJ$n؈5 IU{*.#GAX1T*`$BU ׈4`yOv1KLY#WfR"&bQpHlTo"I!:.H,+9b4rːTYG5!wxR `j?cЯ.(nZ.GaP/ 'fS#;St XB9o~QN5r)MTNQȱUC8u 9\^])l[O:m t\2[7)tQ`@y|'(w&scw/X7 7|x]ޭ3l:NY t>m<%9um:riݦ@ژ-ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:ݦm:HěM't}ģGoMhӁutMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMHm:Zm:BMǦDv26d\ަuΟѦtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMtMHl:Ͻc=yݩg'_IPnpn|9[և'ҡ>sN__0kxjfzk 77e8}uOTt$`꘳ Q'82r PFt|`5gwd9u4ז}>{yQD9jv{NIυf׳(Py|N:a&i<200o?}ݴ љ0"a$:;*2Op\"/կNf_]![?['wod|l9YiO[uCwˏqY 7uvrc8/9r7ОpXB(yX(}Dj#5F);&TnA7:Tb6C&Bʂ bs[+gFa%~8zg-'008k2hxEa|7G_}L ]ﴌ52!$ULEd-lѥh'}v|h';>2G է`˟jcqX]2{V ?'܉'9^<^ѷ~w~j&)祏T}5E"$2eۿ~5v|𥋝*%[á5f*`QvK c'XOBLD%hۘ`ǴS)f̅8AYl] p[;Wzi|{ p*D*2FD${`66۩c޽L}wyQĮN-AH2\KJwXi Zix8Ix/hD6 BL%gTz֑~*$&-Ť&B2rc 3!g(o9H@ۍAFo~2_l͛xaw Cllۥa`ŸA.mhvfnڥģF\/"YO3ͦ'>o>s7g/*trRm,ͶdžN fxNq0r;Gxvka vBkV48 햣-6v;Շg/f@6Ďț2 g_ood' ۖ57QS;U\]0dO͌`ᰴ@>z;?|љV3'腉ūB$'򩠩,v"`5ҳL뉬/hgoEτ+;ОR-) Kt&)O)En?B DʉNXc]0z"`kyz*=l XS`c"t{tEg071_q2>KWm1H3ᔤ,yO)|3HVy,hqaZG򰋷;x#ZxowJl@6#]mP_xX!7MSY$$90S;3d'D:w MwVi9; PODuOͤڟ9t:*7e%/~a7Kkf71ch rWۣ>JfauuGZ%dUeP/F|Q/~yy~\^L՛z5m=j_)4OZOq[ݍ05NM. 
t׸Xr8Wjrh|vp/y?c{`!}wX~"i}.]7҆-w/w{rugFvk<];$` }k]*yqZ/eD"G8&BVƭsG*\RY*8%r^vԖ`N,jˈRP:lzBǣPhqjKL" _ UGk^0r_7jk/x#Ù$y{]OqIOډTlV8ON%ϗQO>(XM,:6H3 B7k9Ƨ(9޵d׿B TKA bl0Y$3_+ I͎SMeD=)(}}]ۙa+}I|0I޷s:IH= #}5^g.ǃHXֵ'{!X/Y=A`>3RBE9~:I|$XF6ԉ%o'Vzk9eVy==SY X+W_X'Hɀ5xV>qm^=}|0GxpZ8 > 0|\ϻКX߮.p^9~A2C;,"p2/|W w^ osq6)a%l[\U}w?~ݏo PqgY^[;p7rh)۶n\owo { lOtyby{ۉtv=|2z ٧%7߾mnݻ`m61\Y>?gW7Զfs1 ?/綕ʿEν[arֻ7D?wbostY迆g wP;xIglDS =|+\^jjwH7⪽6/0uo5|n;1"͇6Lq\ !-?}pwā=soqpq?5-W|"T8on>;{8l7Yx;OX>O=yk1$Ç9 QFT ܡ|0/0HLA:2_`Ogg6:dI|XWw~MxN-D\Aؔy.?g>sno]\D_dW] JIj!Tdo22b|mG,gz}ЮO% |jka`Vb/p9YjjVD:e8))yo.Ge %`B I)PFH- 1fUaL\&/=ܘIK%3ŧe^I9-Ud8sgk,ao UD400EMv')RB)P鄋|)bN O̅$jPc*xҌi^:rZ]],|VSR[ 2itu!E=b!#|)h3Ji Yu2Q1Zs[LDq06[ Ox4ٳ|.Z<2d|lb& ]BP,IfxARS # 8SD# Vgkc6: L x! OX] h/<aҾ ?eH! g]H$ViM,,c>:|bmb19 ZT%Ċ4Jm'A$ʘmQw$W= %gW@x}-ɻ)a+)(Sp1/Ql-%"_LHM",^O VPR,@"QK[K]TJޒ41[EQIbR))"(,ETP Uv!(dh5"aB+@]`I-KD  9I KTQ=2La"XˈOhlj lgd)-Ohd,|n|`ZSΡL[Ψ28셲d _&L9$EQQ \U&1āN iP4+`eY3sL)˪ Cp1ڢ(M0p݆s,Z&x@ri+w%Aq%cT*dJc`H)V y XNUg%TJx*12V(' kOIBf6l`ÖL~JHVG:(١A`R4eAC"(XRN 2zU|Jb|Dʌ ˕3f$f6 a Ҕ`D "d*Wì"sʈ5&`#f;rląmJ8tRp$mv @8X2`Vʤ|%"LkVJ!(96Z0J8 92j(Xt5p˓ZERl mW AO9!R须2ZFSV ^qOʺ@J"fJ&{"Ո yE;A!a_U@i"j8Z(%@ Pe4>A a+֢ƛpՕb1NƉ LgOAZ=ڃ#RdI1 E%ŊԨ‰rGy}`L1O7mQ) ^x1Hu`Ky@X!Ӗ2)тҥ`ruA!ܺJ78bgP8Xm|`BG1@P$bBW&DEɥaZ`b|OA2A|QPVGA)vR3< oŭ XtuvR& :UQ(ϓWwZyn[M}YI ' O+=SpƝCLEm]LjH-v6F82=l^=}}Rf B%l}z /룫FUuE_J (|,CJJr -hG;f@@'^ ϠB.dsړrFP4f,pZ5rC%p$NʀGh⢝cWv,L$@fr`I$@?A jءwG{c2NIE!ݤ4̹HU1,%N$' fcI;A.vtRœq ,FVЙЈ7UsJ]zˢ E.ª: ҷEOP,'`@E?ٵM)jU3vaHu+m`-Asn>_t,dei7ղriI 20quUK !)p T*mtn=v3ໃHQb1j]`֐d'YS@s% k dP.=7&3v <%*`r(P/h7<"dk4VMr)pW A2Z`hH2*$ 9`ՀzGjt}V 떍M V"bs}蓋+fcķ(u63U vbB!ekz ca>5#>qǫ5uSEqOTQeʌy8cs*7V(Q8SkW 5zIX/$r`ɑ,IXLTЁ(#\5%e\4ҵҸe"iAt^S]xYk$ξLH6R ,<VPX?i,thYe: 4zȕABb~)Q#28&=%2ZO吣ЉXe(1G%ҭ^0@$\t9m&ِCL *VN: ".ńEU`ZƔ:xB. 
0\NH56ץuk]]QwgP^me]-ōvpX%2;g3W0 XB )hNr^NPC_7PvvUd=p?pysw՟g!WE.b0w˼~On\gƒ4C[Vپ!iߔM׳h+%m/zu752ۮLYLp5Lhy2w˽.t2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.2.y2\şL>ع'#Z%^L()L#HO?KtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLtLyd:^d:^8tyNCktJןPbtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNtNQ*\n右W/J6a{}fTJrupz-lUwe6LrW׫墝ك. _#r v%@?*m~-lּ7wž׋jfvX3/jfzm4eF̥n{^ btKdToRlPjn!_Wߢo^͆oJߌ PwڽWw%eti&{0=&Ԇ7`Ӱ\z6Q%W7Wevrېޝ~}L)Hg2~`)[ypg KDқ"/>L^y"`,ʼn5Rxw*`$4P'/'KyDȎwd^$}>4Ǭ q489ILHe~o:Iǃ%,%ۄmT,mM:ZZm݉5N^8N˅'ٕ_lQxCz P.:B;6yr#'{F俊`;Pa`p3=`w6Ɏ%Z_[˦ܒ &bկdVJT:QBiK[hi i g9vcӉLb%T4'jJb4 ~4.+ njvk$WvZGÎYkE=:Ӣ  .+vP<&l s(Ls@W(h ` va8H0{AڃXa/ GV)fo:7vSLsUldZ%J|t9t/32Y[ql 8E 'Ӗ+91'*n[&]@YK S\Y#VK] *_Cǘt0;I$\P}Z-N,$hK\j[B,ib9WEf׌'/Ī6-L BO[N'{FVv~[u( \.3T3vV6H3Ntr &Nk_ iZZB,\bX!:m`bPE(%-!V d[H|%LjY#Vkf¯KIy,7I$Œ*N9HfT -!Iq[8%-2+$80m!VJal JJTKR$9[Y#VŦnsMyEq:rA]=N,QLesF2%-!%"BI!BT"%j0-!6fm;_={F&i8G"JO}d?+g='O=eupP%n1_W[zP9k 2asxO3`9ZS2+6^Ќ\-f"]ILX9*P|`(S|_")^t&r='c["m݇W41O92ϩ4oǛ(,^)Uh+y6cpo~ڞdQEo?U}Q41l)1abb#Ke=ԼDcy-!6^xKeXɩ Mo g5UjWN4Nќ Cp!>c$nʕDjs/ILHx3;tb{f jNbbuMlL/"BZBly봇 Vi00mY˸mXKc8 ɨ2RymÏn4@~$7 ?}*찓J3t;oB32 2h6jQ1R_*W7"N5N+v9^^T(E[&߾^;0^ЙzS=1mnEozmlߌ&em1t<,bV77Lg>hĬ ^ҍa?2{_m S`AjyE>C5E>^|Lc,C9{G9WTvϰ :FI )ģo22v4^ 4P(ӚZ V4Uj=POc+< _Gwu&c?lDڏ^_w{+}K oj^'~;&}L^t #6g׏R3&ٮkqY%8~H򥘾|YA~b˪0qEba̟%\ka ;lP&2=^Sin"B̗jpcxր|sRio9R,mr$G,v~Ԛ#4ײELfDK$:C`nVq}WN[L@/gNށwͩ軘䵬`˫'Wjh2ʨeٸT^m}\F3*Yo .˸?*/!TS-V14gi5o `v~*oowBvy2oa(CYe A1BX\˚5L׿  \]w[rۑ/%husQͰHYᮗ-tOvpvhS5,٦ .Jiro.ͦ⛅~)maSfp׺,'5߻0BRۼ?0e<)|wjM4yY Ftnð`t|4w;կ-Q?NC@,O}56?? GGUJ˺ ZJK#={VT{J[Z+\2'Tv5Q\{l.w[;]TfY (XXAx:ﱛcu;tuv0+ʘdћ]e\q;u~U:zpun>XMϕ (.? C~ͣ lxQEp->P KPmǚYs|数 q3ٌ"L:qsg_>[C A(c:fl8"JȜs(v$#Fkre ӸSuAsg;hgPu g'dMmaT0B.! $.!K2.!Kv塡Zс0-}]0mE__wpnص*j6ZkAOƿhXm6U\F ทa)(tJt;#T ߧ uHUr9#PU9*Ha jlZ$ےﮓibX\_8+bq 17o-tc"U)>9݃YLΙER8.%v1gyJ5orM葵F*HbLp&(5z:zㄏrT)SW8c bjSب)6Πm,~BuT!AȤa[i(5|HsBy/[V:kw5oWEـB4Hj@C679"Y&lT8m”cRLՂ S-G "6u|TcjZfL%S:SP.٨U U0&tct`-jw5oW_ !  Fz@FsflL@9)5YFBFR,<t'm#BOD SH vNO5iJ rS%|Tw.7ebgT1ʑ 89U! 
'F 7Qiś2ĿMW!˝E<#2HzK}G;ژ<ӊ7d$cG|5iZ1xc+* 7"S58XHK`4eɦcR[m Ҋ7v<]Ͽ5•jwx+_svbr$ 'm~Ʊ6":j_{fe*Fb^ |MBsu.sguT)WCn O`YRRgm8t@41 ۪:ӊ7"踑ARE15  ί( R>Zl)ś2͎? yigv)YU8ϰٌc3O\H7x;KtU+#xS+S\T{ֲ9qW f($rpScTp;J9xSr "t^akAa$q֞ǭV,K؉6-,jJMϯmDF-I&L:b{\Dހ(*lVU.]aQN&A [q)D\eϐ<. T 7xM՜ SReGN 8UY>\©XZXm;4 %q'32+ ܑ+mt7"fFIpj*0]yM,RW5桎RqSHwxhS<9ثF׉p;rF'kƸ75`dW(򩺕>ش7SH簽|yl2D*?\[A׎ 55wkC>Ujj41ׁ=k5y=uTLZMd G1AOU- e]k9vYi|W;iQښ:'%XER$RZ`pz[E>Uj4rW$ĩ5 ADR)ɀ֋Fz(*6w|'D$7 cHp%u J^muQU+({&z a}d^'lGƈd`0V]FbrRtAڮ=nqÐOUsY6L<`n$ct<2'+( rOVFyh=Xh=_5O}k|PC>O +z?zEkWE]k|ˊ0ލP^iE+!Ʈ5eEWt J^RWÿ]A ^l_+7z?z/+׮$={Q˙ S .l,z~BcEuopǥm EHgC\"9789 H߸H@G QKp\% vPCJ݂²v+Np<wTfwwB;X6C;o3 1:w8X fl`4]٣]V^8U=FVz({XŹqob(`Rjkt"ƽ[uɐ `Rz0uQQ #o'̘(Tj˵UGg}ߝ6ԧ!|`54s,oKYUj{CwiW)+N8QG#}T vKQ]Ƣ绱EQ h[ FȤPzP\½~9됄A]e5|zݏ:qp^J$(+Њi ^q[݈"rJfy ~ОiEkȃ ,h E>UfNZx5g+]a GS_;EAM(?$xެG^p'yFiT7wpl7@IPk "? D:P^Dr _1fpao=٥%Bg+j*éjanKT sw,H՟zDR)^RE0K2 :Eb'-,yW gi#!7{^{CgAa>I(a'^yb(yDO&\*E(I^H<CeMAǁHnhE>UfI@\̞|/snlv?u] ߰`f}E+1"SY9 )쀘< ̾`tL!TYH\ڔxZLZ I.T՚US$hns8lV< 1b'E˟67 ujztOp5%}ڪ3n|+tF21 )=rZ(BpoRa@VaTEf _Sbt5%*3N{Mq~s!䉫z)0|y oRxR"_͠e}4K4ԙ{ D}QEl n$]PGŀ 7Wo{3"^x)\u҃&l]'Rx8xfv<]Dk6?qW9Fv@FɅZzS"),HIJFΛ:xBOUjƹr`:aU59ly_UY+SFD-X{FT通C_PRa8Λ.☆9w痙15`B9L⡃N|{cPhAv .O%jYeuu3Nʸڒd}S'B̤ך '*5QF(G&QUng2F7zJ`0ąbLIgcSjлK.C=~k*Bd\|}G kdqF͗}л*DŽxdțFd1̀Ig_ۈi2I91xOd(2صʏ2:nn7|3Sã>@cRҌwPsE*#:xBg C\ NSXy袎C>g3.m~G102蠷y\bȧ\w!ޢN{ao+=?r aQ] SR(=qϭ~AN:bl5@2+PU{y}? 
d["0Ⱦ^0hB|w1ky{\'*Udč{b j6Fэȧ\~hTE>U2#Ƌ_,}X&R' Wtc#\!uF9?YgE͈X xk0_RAvL1Ke@FT̘Ysd{$[O>R"|5=SOuT:KXlYMuभ8wM€1<:*ǑOUyvaxz< e̚~K|e_CkM R|.T=ѐ}I3Jr?Q@2 FЌ]0:FK6UFAx0mLPDW`n:P _ <Qլi>6vbĦ+%gSX-/ 8bQS2’8ٔZDx+ ~m EE>UrW?"g@FCUJ#, 񀌂X\US2/4(#Mؤ,5`ZYMC:EiMd''q19C>Qsr3ݧg2z켴am1eE>uA2+z)1*w?QE%QPf-Lv}8,jj4puXI MNk(u2y\0HbjE>Qr" t n͗ޑ3^ca0E y3yx2Qp7f&CPAY׷$lfkNiQPs>`frPSUReFҟ7e ((kU/8V^ Y]v }Gy]#uɿ`-~H` %{-"cWOPtٯ_{rƒ-ˌ!XUM湪l"k,kL5Ѽhy3J6# .gU2-"r;ĕ[m0 `m,757 .Is ~/}E=ypƐOUsAk=zG)%_v|S`>x@Fɮ1zxyٌ9YrG&A((vDNRJXR\(y4S2Aȧ\Q^BۆeLל5[{Ԙ=l˽ଘa;Q Ba֔tߡ]Qi3B/s:8'Z> ԉy>>̓H)|n.Qu 5*x,]E:&>,X~aDNTp !U18+y2ڪnU|-/&y_HL#2VMMF(>kp]mHUl/f)sht@f!$#<VK \/UbO6ZsB#󢕜qZ<,1"3Enq7כ_ pŽ`#9>g*ŵ;cܒN`Vjנ[͆StA :^0`ȧj3gW3~-#ϊq5%B4ј9uױR2 p`>1UO^!ɟx7 6XDžXwE1,q*)k}3{G<07Bᯃh( $ ῃh J9\cg0b>]Sď~QӖG0`.x{NlŚ6&Zw⦩0p*KYmmS@`iU?eLm\Ezw+0rp uS5e޽QkWFS>n-PG^qi3&t_gx_@VR$M b=xb_ɩHz%|D9Zۯp1OON[[}H_fӘ: 7WJL+z4ƭA%sQn~Vogԥ3 ߸ mi6 _6-/rP=| pj [|o,{5R5;a >4(!VEeG) #`Bc d19Hl̈C vVOuwdLƾI*cn4">= kcE :xQE?^[WpN];(m?_(JWTleP2J{FhRkJ A=<=Rǰs6i09t6\mXP qB}|< GCYh5qiHPsW#N/vȴTB3|MP|\?>yVfԴPEΊ8ɲ-6|PL?wט>/tPsC@,,8d _"-զPCOzrjXqްx b 7Aؠ ײ׫Kd:瑈{uAŪV(_rnQ;_^ Qw61Tq }ެe(<> WȺ& R|'|W(#g6E #PA9cQF1167Mt|n,sq r)8{Qa'+= T8O@A0o j% ʔ"Ǔ1p07H1{E*agꠓ.ysYyxs 7븣۠"{8 Qo:To:[iHLU'9'8;954u'G1NP'ٽq^yqbE.al48{m~ ՟s\ۻqfփqIJsݨ\Նd OP4)dP@}[a+&?=?I3ĿNs 0H}m{| 2(qPc$@WG 2,qX{GҌǠrk4*jgYc a+o t;@Ľ@Q{3o/;8Ez%KqbBt7+X e Wep m ȰD9NLػ#ML RABOwi]8WZlq9/KD,[`D ~;DAPD1Ad_ANQ_[I `Dk1FAT Qth$w Vpiu^qG*W)qM )0 E!KBsV.^M|MPoہ "nFQ6948*c1_><mB6b1P[iGЖhdx ⓮-Hg t|eTd(-4)x[V"MC  rrf2;.(}*F dEj q6EBFkqv2 |jɈHu[7pŨtdD > 8e5C%\0+R]"ia'O% Zn.92sJ3/`6YLn2;_k@8<5\|q&ᘡxg+DM"D!Jz:H!NmcFR t!o"ܨQ0EU>y`B7~\4 ,_7 =pd;<=@|Rlo@.̋erLcOTd%kF")C;KBU%ZOMen'.B{J04@}W!yc1a9h&č9æO>/ipHAv((qnn5uQ"N ({&6t ]!"9 gc4g6gL)LQќ2y7yݝqwXc|ÿs$ uq+ 7l BZ*OɨWDq8WYk J!8ոՒ NJ΢Y{1k)6A"e} /~w1涯yMs,k('_)K8^%TШ8BHI m4is #79)֊P eԌ"l447!nr7B j1Ƹ(9c!.vW-n;1|m ȞVCw֎svaE3譺!܎FPjlƢ쑎'#cޘTtPm ~ɧ0ѸЊǓ4x\ŸE"(YR#q|`etz!ߦCiKodXEL*5G3b#ʽۋ+q 5!}_%2xF:3n&YIX4O큖( b@Swths'&Aw2&[8^grp9ZBMFF+[:B]EgJ;||~i'u1ƫ R^xw_Sm ^o ^B{>.ۅ B2jڦ< Y) 
3>h ,~†zk$F 1Fم<וj)y6CͷO~#{W(w^aU>Tm00]>MI4e$=)h_tf,2ymu*3<{9xs;ɧhjD鼒h_uAI>#/\ͩJBbS1%u Moǔ+c0(JM{؂ͧc_JoQ,232 q]+qI(w*S/k?Bkw1ʞc8s׊, waC}Ov1](X<ךM=^f+1F٣Fdek32$j qC2kS@) /+ޣ *_[B ݣcQIB&uuBTQݟBXhNKW(5 wMo~]\ͼX}Nོat*~#ڇ׉u_'TJâS%|kd:yj.jOG?M׷>/7wL_5?3_lfv]>_ y)a(y+nH^4mjݴ|uYx|LV,Ӵn7?LRO|9S6 @m?,w|= bMd0?L{9')LliUu{<}]̧)I\ϋNxpL} qjWǻ_=SDP]Fba~VSg-(o[e)LS@N}'{ǥSdZk-}]O -IW=CC#^yHI.0OWe\ej\ S%WtUxXm /u!{R )6;$YIdu\ NYy@̩Ư~7)mtӭa=iu?IlD^hŲoqo矿<}[> T?!ESD`(N{Vw#ի _޽2ЬhN ?U۹PM)*'uH7Db6ʹ]kNe'EzY<&'rayAo?UegAj?=Y uZDb9oWtXθ%1d> Hrj=1*:c$񒐒xpnB R`(LTvO/#թCm;G!:`W9YJGׇY5Ϋw`c;s(prN>= u$n1ATg|9w h6.3Vr!qY dWZl d.<ǽqy\d $U8\ދ/M=c !2ݣBl|~Wv1 lP{(v>E*}=_8iDsT> .2.2`=>6f_J kw-(kA mcmd{$I]鲧! }/.1#Ekcq[썲ܖ׶SM7(&3vEԳs)ǒ08ψrњRLB ajdYI KT4= *Ӥo l1E/][[!ĽL)Dƺӆ$L\Ck1'XWz4!7[(#ȍ5U;ˏ܆:{L\F[m_g,%r+[G@^\5ݒrX uKx:{n1(Ei F-(o^2]:or~Fǯٚ/_VC7>(?L%G_^wmCn1F9/~x<XǼszq{4 {G5ӿk-~-bpy-;2v /݂J35d]\zRI݋/|;6f~m 1vlv[/Vh''&˾9b5frϹ%FޥHs_"QPEx $[ $7mKL{1c5aPÃʕ} ubrK巾Y͗:CEzt\=j]`C=[q`ΖBf?>\P!7Qgo-(R%oWO;&WJ-AC=a:O3Ypyc!.Ekc; $[S\ӺTqn(-mJuW+H]`G7d6mB|]#f&c `8C @$ZVpuHiݬNX\κVSǑԆ!}liJD6p* ޞu5@r1e""ֹSZ RK^4RQD8g] }6!8FhtQzNj׉$qwmY;z? 
;I 2ɗA=-)R!)9v>&R)xA[ynխ,8t]ݛ#i-ہ.4ds•'>Uv̕#so7]B8S.xKkNX?W]<)N׍v'_r=YǨ^!̐gw໱G&7d#;}FCH`^W)dY$4ĭݝեqv]ny]\s׿#Kbex]foFq(xhGd9J3g*K7rxTzTz977]ܺt/w^4./ZNJ/:\6wH|E~rMp]fkM1a2ԐZջAmZ~w$}2wtqEԘG_oq"rWf:z"aܸz^hF7/1^_1?S-d50]qTQM[Vu;5~kZִ̈:)9t N[?n>=Ju9eco(5Eqdµ`[xփAzHx*QoԲ8DFP = /QШ J` Q+^޴]whC9+ann~; Vw!9#ӫgV,w,[^p jlMm7q#I7Vaϼ3?De;.%61e7^b#5Ya#d:bj%hLoӏ_^V jx/ H?G%t2bݞ#~^JlMڳɷ2ҋr zN'SzI8_w)M7``RgqIu JQǷ<fߓW'$s_FRVnjլ^|QTznl5BhuoR-r @'u\KVߥG.*HsjdYǑȀۯGNyUY;~$cFj Ywdg8ϦyJy8Dk^'׫ȧԧ>{}'nr҉`UŌpWyV^ⅭZ{ix 1ѽw馼n\^@l5_\qΑ_W?$1doAmQS׽| E Q'"o/_qCӳq3L|nL`ꖫk|r'WM\ٲ~ j95L޲xI|Wvȹ ,b(Qݸ^^5O̅Qoƽ`#ͷn(|-XW߮ /h{Wgqf?GգŖM-E51g7淳z6D+>#e)x2ܽۅk_rOl<\ yS }+Ax>ء>乮|&=Iهty.fVb{1;ɬ-II@A8n 9+rD%]V:*Pס֛ʂ* 4ݐqG*Jn8]v*w}Ppe-CfCŝj]Xh{Ru[e#-+7_n˳f4n%ԁç@N{"V ,ROF"Us@hСqkxqOˋOMm,\;^u R4/'8fVƋ*u^%Ȼ/(NS ¸w {AJ])!.(OI)Ve_@ATZ,F&TCMuH6$ujܹ&&v@nc">`Z!U!;>\qTpw[E<^cqX\-}M{ύ/wݝin{C7-Vܔtnj6ݘ}MV^ZoJET4/Eyֻ~~작Vgw1cԒ'#2s,NVŷ)^J FJ2sp& 1=\>zCr0o}v}`psCVsiBro7g t ܰr -ZX g T="t1WXr L,tyE?` PC`r+|`rWp [9X;Kh8YWڌ6F&i^׺C!R_~/eZ}3MWi"ŶiϺU)A,?cmY?ou~}I]^N՛ v7_c[irov/֊[U&/ضGmeZo+wdpC ,[܏=${Wvw@uk]%fۨB7roؙ}bO[ˏznJx4"- _IBΥ~ȵݡ5.ݪo7>ۆ\ˎB,*ZLt Mu W+/# *-#FiTlO\D- j'zMBEaI,D2GHE$* Ȏ6fSx-~jP-2W!FS& *L9l/j eV1r2iCX+JOjuJ.%h(3xCmJR>&0huZ "dl9tԪ|Nh42 +اs=-V%x5 "\hK Axl<2l< ZE+'(5.7D `$vfsE\G(2lk-"Vb +g)!0DGmwIK4#n[x$(#N#^z| Dm)}+Gl!xDR;GeMZ+;5\e|OH&,W50@ ʃJU,DgNc`4dT *6bˮY{q֐fYG-X৏>T_@s(`VbJB41cT% c"P4 Ѐ`:fDRNQq"ˎF, h >oV#C Ju) F]afIZ#ƫ11٤RJځ L' ,eMݚ-=DbˌYIQ2FM`N"O?+1d9'Bu#lL|Pi].dwmr-*\ksdP`l,, ÓO (~-J7J:í.8q-ĒUZPt(;ۉɗXz@zw> DLXͅL)ζ+4dԯ(= ɋq^; yY u[ x nFdP6:c"ӽ"TvC?IB~ K $  d A /KA@It'Bۉe>mn|:y dHIm|&P 3+SlD01aw@T;oe(׀9H"PBޅ\K[o2 |˞gR"D$e@Gz0F{CF+=l9B- )@G@!d+ h3VC+Tg/$r|`iHٵK}^AIc纝XDID'( Lzd@BPp;H("r=B,2fzax#H=ԁri$$cu:l"H5s tXZfH{,ؙ qD#kQdfMDjW*bʲIY湕X@K+zMw a$R' G|#zw%qJW z2CoosV9Re<hIU^.6rQLC*太iäaz ny|sI _~u&w{r7:Wī4ueq릊km&{*bvFIwT01(Y:Ze#o:K=*G#63Sy[ owӫ7[psw8p(ýNB%B5G<{'4'}D衄>+wa@;p (ytVZAqf\4\*D}Z uT 'Qs]0"ɔw.&'g.@N|[%5vg۷o0n~up1趎N0]<{oLu\η9BVdxQB;E{ț>of\(\Ek8\:Lפ 2n+x2Ž x)W%„|90]_L ֙_Et"">Mi&` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` 
[binary data omitted: gzip-compressed contents of zuul-output/logs/kubelet.log.gz — not recoverable as text]
˜fٖٛ0[9h6A#<v7Oo>%&:Ԇ!D>Y,hڱXߴȴt5ײ:wb}ߪU"|isKO~TȠQr'L<;`&R'*z!ϬJmsmZO`oQ^x^|MyY#Ӻ 4]}u|kB~ht<=o?6if2}?ٲ{]eqA^ą_7|m/$%|@yl ~֯ϫEzqzs|[ U[ۦo ?zOm&ι[$X>~Xae ҮEAZ^rax'9c>.dh)vGuj_ז;ctEc a)F!EAteKJ Qfau 35ݵibuPR&%G#JYE襴QcXdejL2͖MڲH*Eaɷ+^הdmS얇.;]KSԄfzy{CuQ\7~6QKF*^g' pZQLPhYrYn^"Ϳ< b u"9 w$! &pldE,j{%Ú UPyZ/vlYk]lO<-ݣyZ,\Ng/.^V waBرRX"v1DՉI"\SKX|#fg69eJյHu`%uC&i)F%b96 @>PEh4AI$ιTT{6n%ʔ>o=UN68.+qpJ(R6%D*i} s̅viWNi`-ݮ:>}r h{- z!ت>h`qGņ$Wn)?;g\joE6Ls@lH v'#m|EݨCJ!%b'bl*rbpiAH!&" 2>0â`1W(J"B68.|ǤV+lᔰ֥ѪbJ$TL8-q)) =CpO;>'`+_~酽ý|rlFGKхG~ku ާ*a\o^Z m}v?(袀L d68`2s~Ш:^RAq9ڮЪ·oyj,y,/32hm+xCfdGۇűQVީZyJUfexg<G%WdeʿФeIԹ ;lcÂU rf=Εdsj(隀V!I [ mi% aNa+!egieR]e!;Z ʵVD&7;cD!  !ۺE[o[ΊК Gɻ[V-|TV@ydI@1G㨤0)X0'>*N-L+PU<*0p%2f`8>-YzZq56(mHR9`PJK#L EgKMy+N3ZOVQ+05Y rc#V zm19[_A ф␔RԴe׵7Ȇ&x"ɨFw8T0w$L`P-GDsLU@ll|d="ÈNV < \<2SL!C@yp:FUl\2YPvq2E`PB ݧQ x6 ;hNz76 {'m>PX ܪSӦ:,d5V^k`偷ڱ)eSy&"U3~  _i2eGhR>Lza…ۗ~ӱFicsz:xx8\|@Esj\o2o_`rdv퍳M!]}9 *9i]u^?>Y"7̋p0Ao/Zksn.>Z^Y ƽ +g9^6O`-vx~1|%>"M,e u2=;:^lϖ?*R΀N}~:$F}5۬paW~v[xԨD5*Hמ\[}θӬ'~h:7uW+W[{r^[LZMa%EXEaL`YTad!9['}#g *^PiC D}giiDQ)!pEy2PB1QZ KISUÍJ,R̓e wW=?KRJs5Bqrp̸֍'(=0h<,3M:{l?5GH((gElr̥DJ g0Gͣ<vTwhlb[g,[+)*W.R0TD J\QNQ*Yp)̋ D J'5m6NGCJ9ō5 [EĖPƽuH ÌМz%hlXm *tJZURuܹ6O Ru.V,?1wXi uQeV_}$}P+,?ȹ̥T.dD68~<0smCp BC러IT!bHbnvHŊGH\穜e_t2/bYSLnz>"MDYժ9ULkfvkf$M?gtX0tѬֵEEa&e~Cv pNuς' be՗ ̮NPkp㼫 ~Yd *0wHsIMqy~ m;KGҌw:::ǚMԲb#M.FYi*_nxxVQ 7;68dδ{LW>; >_lٚY G1VpqKL}o)_P*f`dcs=v?[FftӼr%܊>58y`N"}YHZ\K9L$ ItMVV8- &2lw!IWd_v$y/>  +aQoZh9p~ WTOew5zxc Q.[EdFVuۅ!P*ldhZԘDTke)JՊ Ѵ*MT c-h~kHovV[!"\Epo&giMmNF51HqSZ[KCU![%$`Ҙ&QR*S)mv>+7j-6y# ՘/nmYiR oe# &e76iRLj9 pT|>ƮfPnf0:bh!SWo0pfz|/[icVBImw ۽tFix9w F`?80ИUV{mv ur>:x! (y"h_Q*zE!hi/Nzu 9Yr6'/b|jŇ fU^ܑTRZ  $Y1ڢD*ZGozs*Ôx8Eq۬,Nin5I:'Eէ (хk§tp}E")i f;xp+@]+(ɀ kCtYviF*Tҍڪ-h'r%5;4HtP)HH Ш3) x:CaE7mԣdb [W6b]M57`'}|W F.LʴBE9Ff,(JFb1;L ;ցNi,+~(]ڈJ5(Z98)pc37 3h"ǚ4&ELIgdݼLFLҐha3\BvZx:-}t^/IfScUCƨ<"Ǎ >N:jNÀe+R ΀x=`eR.$ N F8@9#KD9i9yHmM곩\4*w3. 
L/ c0&ʥ"d$8t!St\-)QX y(Op]ghuECޙ"B1@_mmTq7/[wW28UIkglQa>4SulEtHp]FgM*t֦D/_YU?fB@9)4RSO?@nj򧽯{_Y:ߋRz?/zIA]3ïV0n׷%/|0ioom~4yķ7-VdB Z^:- +fZiBd1^CD")]կ8OkU Ǘy9;\̗}vsyX7E J΍A2^9ݷӍBǧZ˖y(|'tE(6hvQGǂ ]xȹq'n'OǓ؝;IVAɓ%,oT&˲MWI?{ȍap`{C 6 vv0ю,M$+v^Lٲ2HVW,VEUPF CY/eR1 _}~mse<@a0Hn>~7hS楡`tiu1tTTk3$y)AhpPm*+ftQcD΂n1rD >u8,S2@CiqI>珫8l_ GnXn.?n9KNFD $-U-im(AVbTEEE{\[%${< {#hVx;ayUp<'D& ZR΀YL ãf2ApcgaqSmMR!8r4uPfw NK0%SC'Heb5\s,2)t-ҾԛYgl`Ɠ*zs&Xi<xdYE4 sG\_EϙF9긫٧ /[c|'Vt:}\9gHIpF"mofXJ[޺-> |˨n۞x#>~㼻q7Ix88 $be$penk) u%MUJfN35TUo913^2\Dv:ak׻og!ց Ԏїal=.T1mLQ@D0q&rIse#uilp*܏y\X"T6cTT&b6u΀gGL`&Z3& b{W Ǔx6E-X:~H.oY>Ԃ WC||% GXStpVNY1'o5Cr"G9LhYERi K TzmDX%RFJRJX@ 1P Psbo@Y"1Z(OF!wVz=SbPh6Z=l0T~0r |nzf{Ǝ|u.R_sQ vu ˬgm㔶&r+iq unM[ν~6ӷHەDwwuUz{8Nmv:ut-#s+?d𷹿NyDoZfo,EG 5-w9ww^yJerav r3as#966wnXoM͔ie[d^%͠.d}eTU0(K@:]X8*dt A# Vx L_X 6!IH-aAiI0TKC *(o;1c؎J<re Qv1DiIk -"zj&I@P0tv1rvԏa2DȊT4b18T#tӈ8'R+ &9Hdu#s9qbzk(!J 6)MyԂg|P`B@$̈́NSg#gFttJb\r^b#"ZaӤ0\(J[R0LA@&e8)@N/C/>,CNö#/U9n 'Q"Y7(1 l K8&iU te4Rdtmp<^Ÿ&剅>9{(@/Rh8%HIIQubܛ`"):ƈ#Z{O#29'"h-j-!1@XJ4hYPZ=PPY$/o8])}I/[#)>“T^& 4Cp)D.X(VFC b$0Q7ܪ$3:mH0'6 J7s+i"O2j#TH)+{\Hr%9]֖̀pKҠ' H#gO9;h>FǧRL(Mტ>ļ,q[Y<0C&PCO?35,u2Kf4$ >$"Dt f'u8VmSF)8o ·\$B$QAA)}f(Hi gҋNqQ'UOMU xNPy"A䊋we4&MPyjAsACsjZNyD\F9Q*" -,|{tA G8U %Ner!q m[mIl'cambrtHRV:$=:Ayc n9TuEvWP6$r< 6y`+zC0-+½&Rqs-*b֥]Q^FZ|֭,?eWrglURp8yaU-W7z&K)efi-ٽVד~-HRpە!TqK f5FdcƋШАH1aXNۡ5gR əDиT #Ⴚ Њۇ:qSR\o?L?ՖB9ʨ~ N˧I5e{Z*8IT8 ܀L+4Ar魑&f|"ZoAvgMdWS0NpfŮ{<Z/-q>ؘh& a[zq'uC5>w~~?5;h(>ݻ{͖v5qyDjmo.Q!7& R3@{OR.>ԃOԖhmR0.`[u^rÜQ [XcI"kCἭYQۚ@BhMNiAfXlG.XQ`+(,tb! 
MIoM/l+XZ6n 1br0CӰh σ1iXJR^DDž҂Phss`?lu>Rq9:P A:\zm ށD&!PW9M2xK*LjLi!R&R/5eDDLH^{0<)c"-+-[Ѹ|]-J]{aǠo7G6V6.xNcWMer&3OhxUGkR2.%(88!̆V4HHhIVh<d ]-Z+1" !sU+ TC=u YWө;ߡm!nM YgP~|:kDp8)aGkxN赀x4Jjsbፕ s2˴bP*lx17Y_"ĤJEt=ۊ`)t_o~Z ?5O -aM08s*۬-J?AZRUr xr7NqfTwa#]L'|S{K=/u[a|kF;M`LF@,0cWXuQd6>Ds50Ub=~/Kn|E ;k{ =Bpf?\* ж`ˋy}g]"~n@Wׂԝ>7)?~*s5 {LALZu!=SH$08 7:<  Y.2F^4|ZmۑEa~K7kcfrCYC[1c6lqk Mqpr`:$Y͐ ծoCd=Wsjkp_7srWus(gh?'>sXUV$\=CsRJ ~4*+ű$]7W FNY+)Z*մ?rP5콉z[~^lG:3LxQt3^6Ʌ1co> ɴ&ΞwQLD=3Bhd"yd;b;kaR0|%gS4ad#O ƙ\3&rz\ݢ:U~8lUGy þtgO_^鬩AN3.ϰ> c)~"%{rXx?'w?iOc'<9yONfOnߡǔHI\1WI\J\%i%꺹JR*s4Wuu0܆0^7~eF Nm;Z]1_3 d hWW kMvic됅hh[|kBSų*lq7rK s5=#><^~#hTcN(8(AWacd l࠼([ịUh4leF.6`#*$;A Liy&|',qv+{5>EQn+gT%\qXɣ3[LY͢Gz/f]Noൺb&z@uC2P[v? 7˺ou'lǃqa9rmFUZGzD"wymyIvb n&q,f{$&ԁs[X½3V!7@'BgLu9<`O3,'֩5{E, \9M*-lV"+(Y40 g`rAMq'h?la$ݥS*D7uy0ߑ)K1cT)1)6 3;US5ث𚢋||Dl@=!ug뉜.jm+U ^$ZPedKRhEGL(t:/)~+ :C?*`dz6"+3ı(EP!p ",VT%[HXG"\JlgWYzs]m}yΒ=j#kw>]~9MȚaYwq]"52*#b)%>5 \kib*,1N [@xxC9 :˺TxGDJNcRd7'eh+͸Jb,aFM* NwX)T*ui͔w悺6%Sc@!F;^ ÔIr=tYkie"9-BȠ%iRcϘ8mHNwm$9_i7ԃ2`c2\n~F=X78Ӷ[dw$T."YlRh&>vԮ"'B-CX>s&]R\&h>*Y\_\9NyDbϾ;-ݱؽ?7p}iYo(Ի޴iOͻ+x2~G%ǂ7frfP(bg،b|&nn|FGG{_BmӎSo>7Cf?l|cMl=S*EӞQY̻P؟Qv7RȨЗvuU&KNF՛~#q^&guɯ|.xOewQ}R5qB]57o?G'Ͼw6_Ez{[a7n٦zO,Z Pzaw9r)3@K:]ׯLSaYd vYy]6'>_`s~o%W~gK\V7p٬ˊK]lK*#NVדE\ ߏ"igV+[$54G>bWS]jaHëAYPޅ*Q2t,NgqK>[ꮡj3U xWBmAv;IcֵfjZF̩[igR=IE&Au>]~Wß_T ƒ03& txp|ˣ!Uo=D :-<7$̎}s ij2lio׆F`|,. uJWfu.tVFb\{ѲR 0r<( JAD%|Z&8Cmd1eoe_O c}1…Pc;>$?C4'>BS,A KBZʹ Fn?3N0פETJYSB/"BIkR0v"[[lVT|?,>К%@_اa{a P?NIȢ+ޡoAl;YOȹ6̼r*OMY1d1YeRvA$ `"es>H۳UZ$X[BaNp4X+f5 @$oT^O]A ΅WSfY.ߜvx)s%~Je FkJ+32g'dBR!hHFd2v̺,sحp\}A֤㾨mZFmӡvgx2IBn5k2 >kTrQtm ?+gH+rVd*f *NF"ːyQ!IdLiFrṁUbjr3}x9v], 0 "&ED2";D\'2[cTgAsT< %QL(-ᖑv enקw3 o $wY. 
΄Ԩhc$OZp"͞FfD@8f.S5)/.quڂLZp\D6y`%289RhPأڡ wI=Pvp2ߝpbzA?4uIp]d?Jƌ""=4?}]﨩A pƀpЛH- Baobh/4v\q8+46İYWvmg(0C)Mv9u}g*j3~Nڹ X ^=/˹_ti R̠NeζϢ F\'Z.1u썮@FSYl&y|]S{^?f{S㞴<&k0V^Q,d&rB0ό 'fT۲5cLڗU9hR hc aH79[@G 7T+w_rI gд`z\Z6;5Q:l|gR|tsJf`V I润Skipg/ɽ,ZP2W,IZ)p+}Y%#dkHduY cVa"p¢1Iʒgh,Dwe=k-l:S)n8k.K/7 ]i!,K"KeA:Tʻ  '!Н Oz ~d%)ztܫ>!1^^:ɸR,d) (9N:'=S+^;?󓮵[d1C,%BA+M5$Q"IaP,p{5x" !WVol>Ȩ4PaٕmlSLH1$:xmEDJe@k"r_s-+߬3'C352٩" )䤐mk:G:hO=f^)jMB= lWgӐ2+{,׆DI0&$tY䏔EArPu 㑔=^T_mɠ*)e)|t|Uy<yZ`O㭂4yl]Ǎm׋]˝v#)fvfi-1孟VGZ00+dj[zY)|9Jck!i^i[0-ej;ns1*>y 92It\d\gƒ jГRǔ`f☀1R(J ;̌`h99J>>w+:v-Uc?k͖ [״~:ђ:e10Z eT)&!qf&@stۊNi8!mtn%oɿؼ̤IiYƴ,N@N0Gp?b'єZOWKː_e8? G_F;'oMF.oa*[j·QTcJzQS$]z#aagD;ڟ9:nܣ&}P]郲i OvN.T6q3fg4JIE jl{oԁ^3ou ^5;cA׿-9e~> G:H-hZ:ߏrn8 LzS&*@%Qo!.2B+S!\Ad dWYRPQ.j\4g Z \rTkPQYW ?*9z-.{ Wʅ\YU!3Xȵp(pE* Bwpjkoux :ի^SȟQ7ϙH jP*{KvnB;Ӳ(1?IT&P}}ɑA 2 ctHc>`\>6 ]]yI1U@bt-$Yd!TY@2Z+ Еn]HIa? Y4plxH>K' b$K[eʖ6[bWUzS.$żg/7‹ \CB΄[z698/dI$a2z!1C~\e҃j&=hcCa̷D~'?0²B}em?zh15jg) &cOG[B F:a8FC&RB4 5H1XF IyMrʌq :΍s{, Q@FϭBG#=U6z5qv[q+tW^}rDbMhla晖ia6-3Q;.%zf٭v2ՌE0n yţ!t; %/t}yDww[P?tJT'Mv-q,/\יHv}LjPn?,|N'vxorD{'6oM<6]QBFB}JIݨeM:EeGxh(!fo;P~:ճp KBG)y\<g`Y(\`ʞstK$ [l}}TuӞqpcw8>oaonjf' 'OZ%A2"; Z2c LS]2Q)Ha؃c3_]Jqj:qhp\:vh> ޶u@bZ.lE\`՟PFI6F'i\L,1e(r@Y/eKtʚv=塭'T/?~i@ &U)DTpkYXZ% &hJ`,!XR)EECR,:9?Psr,n5qD+=u8mi%-.7r4ԻΧ|N?W4'Eqm3hb [S199(H7xa($:}GPR]&Ea kQY m&,,|B4R v*)',g-]ƃn^)rmdD jwzi |]Ff5J3) Ry>![*:"")Rt˱L)yYh[-P4!\CkH,,VÍEKLoQ]bx>˯*]LhˁYE.h\Hڱ}qzv4^er@tvb=;[})H$'7 =E7JQ'LlJ_W>&cL]QޘKb=FK+=yeyPz%=;V7 [.+n:$t+ez՗>g+|w۷Vz'r ͝tٖGg?/V 7~Lm*V77vS vq-zKd =O~{q$EC ,zǷq 1 vlo2ʚKd72Ef]G׳䑌= v[IZ쮖o_?]j,CrHFk7_Sa@O]=MMg~vK~ou=س ӜGF'A]ﶿ@w?mUkT2ӉH?n:\$c{A śl\!K™0 e4e[iKlY$ 8}UI7K5WzNK6c!oyƔv|uJ.6Zנ&Vr<@M@M0J+h%Icxy>tXbF@AL,zD/ao3^Xm]y(MߨL'1'vH'DPƣo)Y?Y=y B,|l_믹ڑ흜;'!:zH b/T%OŔj+JRJi8RmEJ3K9mhAA~К16FyO("a @>е*]r Fg. 
"T6Y(!ʬIx,GktJUEFg pHp QKp\)"\}IO Le [M=7:'uV Y/oG춓tƱ{4o.CN&BL~ЖqI@d9B@T:O& ~Q tJ;;ʡB1kC vǎQͅ~9cic!;$Ms`]LO0Y)%L9AY 3.pn2-*f`n3H][B/" -7A`-X;I8 ?]uSH'%n.?> -=|EwD@}d\C ޡS ;YOBsCf^k-]lN)@{ eI* M<&YyQzεTmXMݞ^RVӌC} 7vsW%2H]sz} w~wC|6(PI3!;%RU'g*e.D#/FʺuC{> &CR)F.L؎˂0r&G,>h]M;L'i b׮zm[kk6iփP |VEdp,ΒU)T!Wfm,=4 ISd(ђIL"AFN,&YHz5qv/+x*~G#q >Ǩ9ϘyZ&U!R{o3ˁu2\pCbz1cacJr2Y%r3! n%B[6u@5qv{HC/ΖKYMK_t_Vh&A9A)H^ Ismu*€b,h88cSմ@(Y]h.bzUiےg~$_dhfMΦc\,Go@sIQi|J_̧u2 WQ&,t`DC6cf\O/=WbRkz=+=GvQ<HI2)G\Rp̵2QtŸ92%6-.}*3пoWzB,NQ:2W^Ӝ ތFߦNN ~s~wcɗzOsW㞎gӅe:sI9i =[7eGrٸ^'\+^ ;Uu=f!{Pi`zG:JUgE% ޚFhI444)pKCUgOI~N@Y~j8$FԍF{Y9y!gֆ:{%F*1& C{PJt!8ƜVr۠U+k{Wgv'Do<=iwk<(bvf,htKẺ3"\ЄT ` 4e&N&˭f͂"Y`go!-D~R:X2ƟKc|-Y`8Mh&E*u|$ Q7@_ɔ$9Hة9:#f35;w άbuW|8,U:w)Rc69\TKה*mբAɢ*39ǢM,旣??"!rF*gPvbri{RsF ;ѡjdCqyTEDも%Hƈ.Ti*mS+xDKTƄ6%T8 ySԌ)ʦJI ,!&"-,xGp# ol5F[Gt>n]rN_2Sn76T9/!.+ٌٟXN)TE3UA'1D N${癱Sn*"H.Zd+oZx;qG<&lնʠݭ+v!v})F>o^I#'纏9H}i8Gb;')I-=&\KMB6LsH9D1(l<~?Dl)6&LwH B p c1 aQbҎ\U(hPJqTQZGEdw9lVJH| hlpo w'oo§:+ [yI;}dKd3l/7˄bz^l4,X%) "g\isf@tƺ56QcS읱XOj8zNvܕƠy:\zm`AJU#L8B& 0 [r9N-f #z!UBXYL^jʈh` Xy$RD6#v࢑5._Ei-hkuq8ۭ?4<s4ڱrySɇ'2!zlњt~K 5jH!dBFP m5N4B}*oI1x|*,5Brvk L:ovLjCFB.l`|'Y w4cՁ'd}7> n^aR;)$h0+GũeiBZӌ&U=+UOy[c&jMd()kV"x\,f:`lp(:k]2iD$sݸ^sMq=xR8hj=qXEP ;HPg%6Q [S]¶Ƥ~Xo~.+dvS !Cu'iϮko' KM($U],LT0w$L`PTODsLU@jj|RMb-͑|_j(ٯ|K4ۻJSO)X/0JVƇ^KMw7K4cs'#X_0g4] pt,X2{MM[}ȄX8qY \W~K#20LnЛނgw|)S0Z35>=y8uLߛ|>+&dn޽N%c6ޜ'W^ `L&&\/&`dϺÝ3S_~v!b&k0ts5E.qvQf=g]r~lT0" WEeOßsuҿ~"R>+-JDWxëեWHgZfyh wAbg [5 < X!m/i]-SѦ{t8u/S\g\>+<=(RL?"Mmg:٩82;ROU>Ԥn~ +8dI8 Coz3p%3k6VRz9gÍZgM /|^ˋ5,Ęa4Ϊ7S^u뙕ǫ0_l0ʜ cܣP Uթrݡ@@#epc1;&ruSPTsoǷfw_Z3pԽ}p$~7UkL h[:6~3zL d5.H|\yQ:^Xtqgi63XHD4Ii#EF痧/(IG:9ӓU8 qy1E%ag7PĆ(c1遃WLj=\Ɍ?^D.\g8E0Er}H4p?r|QKnޏJ&jOFRQ9 Q, Y* 3͓3=kqSF>xY݅ š] hAL MZ5]w)qMiVR& 1A}Ă/ TQ͍QmQ'(߬I8;9gw}Ҙ]lGuaÅHawIYb FG$87=0YOpc9-r͙A6ƒEZmY[I QRH/qb;0J!JEaM&UO ל sK͖v113z]Ik?ʁC?[`#g\W/t"ؚ>dtrq T@f%\?7`rQH0~p 9&WO3JHǭԢi{FfHQmQ\XK->2rG.0F܈.ʸ?̴Te}If=4=a-y*ɉMÝfTj"pp %)\ig,S)qlc1VYE 62NG,6`#*$;A Ut4,shg=9 b[evW 
z.~OQ8kֳ|$ϒ(C"V|aHS|_|K$,T3uBͰe)2l*deo/Uh/z^BwC[\4ةr2vVneoR,Oۿ"RzCn]]aߺ~Q~MeQ7L> u1FyμpU5;P}oےDѾmIjyjE<`-@+;g S Jd%x%1 U$Sf8UoXZM2s4ycH ˜w@*ǔ ǁx7#{U1ت$t|9 swA-qbm:]t1ͅh$SDKʽ,gIr %5Ov+suY~8n~MOx< oS(bAI1CZOK\4 E`(RjN0NQQc (WK Y Oc*xFYJyʢK[:qyְ9zS>v~Q ȏ DUIld4X+}u{uӲУw&)s1䔊GH1xm[dLxÅ>ڒB Y7*#R"%1s2 J3K VAy f׭U JB?URzu i00jӁO`dD-Q*p"h0e(?껖jiD !dP4 XPkLpqiD0HR޵Ԭ mXll CV5$kٿt"hL-`XAEzL}LSb-N2JlvrLS%%D!c0E|;9N ]$b O!?֐ wq+KjE=)Ǥ9}L}Gi9G(b-єпAd6?GG!;IиDztg v7Yy3P0 !ytQDUr P(EJiU6*rO؃Ԏ>KϣVu6ni49`}`%;{勨j1]r`lt`6 .$  Z9n"f 6gRc2k'SӡxƼV[ YXZC(/g(_O3RM3/kˢxڀyqǜ8ZB!pNu_ ˍL0Lk%6nO[9{!B:̗1K$iNI5OZq'CY'+dJۅ xƉ;U-Y /Z41~vO/+Y5rvtP*xDc2  D& =jD<@!bӊ@M\ ;.[# h 1^^zI,B̮NI"T+Z#GNjoYL2Ē6ag Fd\h0tFI$ȡhZUO<2WUe xN4.ڰJ] , s 5=Zx [[i!JD},R5y7C!3ByD6]6)H!zl6x?oţx̕:\C(Օ;D',1tPfd+%e}ȉwet[ϣȝ4xH[ldklRnwpdGVz"MK24Iyq f$vq "43a'*[Pv9ã]~x>fJ UJ$s,C\U\ow YՇƭĔO0QNJRDȌZaW4AIpH7w<,=Ži{cb72c?8gt$+2Wth0ZF e)& q;2NbVީdw_Lf]dW*RŮ7w跭}T'ٛ}C1wqfLS VK9oNӹ9`茪mܨ|4Yr.^E:ʫ|ޡ\畎((uLA[c3I&'%LH23aU)ϸғ'sj}u|ɷ9g*=޺YgEzOq zrPa[vX[UT=-U=2l͍u_ts.(zQxѥ@a(=-J*ifSCUƠ.mOsbF"AU"+0AdqCȢQX]7# W&ctdڲYr iK%L$r̅FΎebs1/J}`|t`qHx1۬^ 4B^q.+xydMF#n ,Acrv:Fo_Q?Kt&!W?}?|!\;wɓY/~I߻_o]Kϋ|Whf>\_}G$`4%rDJ^D K˼KN'{4YRr~UuYr3fwO/{+fy+/fF(h\83igښ6_A)I/rZok+NvSň"^lˮH"%KDxEb`<7sc~a(r,upk(_K.͗|a+>\V 5ٰXbU2O fy]h h]G~ϟi!gQQ`0z9=E!;A<-}!h9_1"W z MC5;\QfXq2Fʙxza2Z&JϋL zbqouxP-(=TP2)+P^AWˋwX]0FN&ǯs/)V Gcp$RNv<|[&fxC]UJծJώ+ιgwwhP[㎇TFn2n}EɰOg d.&f^;}8rf렾`uvuV6 IvnS=>ܾ,uWA ,2n9a]A$=&N Ļ6)q"nǧɞ1c(7`xf{O^( #J{Ci09R$ DZ*rH&gC}!qQ.e9*57LzJ}ᒓ&.;rE!1rlf5 :zjU:e!(d *Z[u*G;5Ƴ\suMWoALl{cv~<3<==r^0h sɽ֔`鬏͋M΃\D~l#NARt< lK\(-Gsllջsn-R[W\ΖY :)ٷOzv{vMAcw]zLrq T0,SdC_2bVDpH.)WY&!Ǘ[.{.A̮U v.=0ЕwBm#cLJ4>K?-1AKުKĕ+INtDANw) R1$Pk1R\QHp8H+  4Ҥl7E2%xn!t:ik*Os14mWkùBx́Nj5 `._;s_G[>a0S Ѐ 9T6,[nT +Q`ef&7ZB] ?D}HA /Py;GV3{"4?z&d~o>){ſ?++8QBrWqu Ѡ%%^gmiO! TjPn lilAHuR+H8-:Fc()xMRuڃ*s MvӪchp.h-KVE@`o3FzǘPo9ǴlY*41Q!N2ED'K.xVJ%.ĪFHVk5 yx_ml$+S qi ݆Cx7\w'2%0mA &g4ipNQEmPBmxgA7,ff\evvOziN>eM4'FD20$bNeA K`$ SVqz _}7KvC.w6W_CkKm'{?{&}^soF:!ҖÉzF,WHe"<tؔI.:;G6UMcv. 
MtZV'~o!DATBKkZ) !"t :băP%Eؔn חe[0.?ߎ2J}0x%3J"wPIlKM~l~ 9\}] Lje7IwPO=^ 7 3kd[_ʻ\2++| Im$,Ρ}uuD)hU 5J4%. Tf;#9fYHYz%zDxñl,Sp( ) )) N;M`.`9 @vXWFk!hATB%L7]}$G^^~(.NtN&i>J](o(VC% b$NӔ#JZd4." )Y@)A%~'u~ByU 'L x8$ U gnHi҉pb8Uk2qV뻎l8߭ :4T{JH\qU)5^grvo%Ɂ.QsA}rqjZNƱ:|Г}QDg[V&W5NU1%2a9Ĵo8C'BBpN=S+pt/њP;$iW;}cI^X=]oC\RV]E LD4lrOd;2?Dz`Yl6sJVB*s]K내Ճ'x_U27RO צe ?ygmPɨ`ClYv;Vq; b-'we] ẸvH1(X{V<\!JVҔ,CSW#( m?F}>݆t՞6[wS:PcLPUDcSNU ,$#NXP\UZ O;g8l4=ylGYEc!Y@ A#eDԠFJFJwYV,OYߟ -z#` S!r3m\ A,!m,W6DH0q$ΌQ*z5# .~~&B N{u`U:uery>k\Ecs?0=V6g|,x8j v~ZӢ+}YUyqWo REo՗l~?^śp: l}iNzF;1Ó 4yy~98XwӍwpMWޫ_O֬xq:z"ǜNBHN2)J7ډJt>oQ} +k8 WYhTשoWإ翃5< VsS?r7><3y3o?x>Ѓ7P,Ym*SO\//=G”?V} 7V].*p'}gV4ĜL"~/f?[L;ZI!? `~8aU'.z ٩ }ZקB>sFb'Eo2\?Ga&M+d:3teMz4nŪp)l-Bo[Ô J&E}k?8syYn &ȩp8z#%=ڊAN[%|UiE< ٹ}k8eHH{W NFq<8Wj.MzfJYr) [LOOMwUu}UJ4!.o -}.9ܗNrh}$A/gl\3@M=K3lvަǼk6qE}{ESAª{NnB& OJ)SԌtLWQ]C%"+:hщ9` ͬĻ)q2E<ӊ~|dJˋPjiP@wicÌ{O-* Q#J{ T@ꧫ>ޞl"6~X^KHrlRt<*l,r4:Oޣ.POS٭NˢT$Ax <]e?ȱU|j]Lq踔>\q,)H)C2!H4&ܫ53g#c]ތ]jcv.]maP@ pFNF1h# :N+g%MӪG VjOޯW:Q±(NY^Kʕ_6\߽ `{g[3L-QE?ä@[!nv֜+(n}ɫ;8inewCD}':ϴ;@O<ة*i~ߦ*.:OG*yBvy ӯZ0"w'sղNGTjPtGJ)H-^_"~8覠㗧7uA;1N~*i[YkKI;욨l'1?o)?:| '|NK)/&ms2 Jcp,][:%5s1 gQG3Z1'j-.SΤz"K*DC@iNN+'c:JҮt4RHeKTpN 5^mW+2bn` sI( \jQZ/( 3 ;nPo+sR<_H\)gc6Vgԯw-|(CShۈ>H hDD0q&Dm B`uTz=VvKWxCeB fRe6ra!&Ú% RTkƤWܢP%Ol9%N; p8Ik|{V՛܀+:dD hEe"haFSIrkZh\-)ΘO%ˬs;ly)pgl љj8 l#pkQ*M KB\)*HӖzpEHұ%cUb]^uIFϏ/TG.IPsuv1: R5yOMVϗUWU%S|m-v}"*揷+b s 6`{BpY`yitc2$ [uL5ۓ:r&ɱ=iYaxbZ gK|sr(RW-h*4ByD? 
RR,HN (EDp K>(guvB{^^z>1E^ Ǟ:]O'[][+.'eI5Ii|P'J;]yP0ճm*`Nc")5I.F)rB5XTH^ ݪK\ZSvc$pqw`@kq.s`.RkC`H-#ڜ3_eJzh( |}g]^s{Zp{W,[>ozZiʹ,hpu&R'&A<@2|0,Jc# AVbTykA )æ-2& |lmEN yrtgZ'/4 ˇׁ +!MF@ *.h4JQw_y*g&_הPfwz,1k*vibksXdRZgrCGl Gj/ ]{48;>/{qsbꂟU쳳+'W 2掞L8U;Ed/{y}wsR9#2KwuȒx>!W?z̚4(g;(6(h6W=5(bWI1اlgƓ vqPE IP]3 oVQP&؟Q6kyω~&Vto] \gg"=%q&*Z}7Ko|Qyyw$7zn?7oyW{v8:yл7_+rܻ@gPrRuu/V˅(%u^29m oQs1Dy/7j>Y >*?=?ny{;'wM|?CpxxI+ӍXg>v~>gK,oMb=z} ,)7 3::9qAjnoj$e~YDWY@^X&ax=h2*wml{mA,.ﳤZU&N ~W?s}}~{tB)ٮqA<{= '8wWz,ʻfXJSLZ=+0Ld\4o8A?λ=mH1`e(x #k^tQ)[J_78uPeRJMdb*:/\:s~j6iRbACNԫTY91g+HyP3hQ .N=eD&g mvTc{Kه|,zlƞ(nL 6O4D0σVf]q臣\쥈3ZMz~I+E? G7#Xv7*1Z#w!E2?N.hKl]}n]֭7w9otf֍Yu8=pn[=?(Rϋ6CN͛ oޅmcnq׏n9Ϸs$0RyaGP hÈQ*d| 8.i-D+dQcr6YnU+ohx[OFX%֠bmШ2+Li#k7+BH1ED 8l :Fn)&TRmkٮ۳UZӅqƾP U  vs&W2^z|3MnP~x'_6޳(8Yi!xTQkKDHQ#ѣLiޮ-ˁ'Ð pgYmrN :!#`C ,J>&WO IT#liklׇVHɊX4bkF5"tӈ8'{Ii)%L(ՙ * &˸3q< PB4A qyQEE5="i]moH+q%{/a3 &ps_mMdI+JI<(%ْX96l>zIj$fXtp\cpf/K4٘ʋa^/v8DD\D͔7Ye29RH\tA΂t/fok>56MݏQ)jQ 5V &ٹrmB!=Pbº@'H-%B`-ZCWW"Jh} *'HW :阚XU+i[*U*X]=ҒVU,ek*U-th1:xDٝ$JaV/sn۳ ~YLN:3HG'u@C&J ܊O/·~8vKDfx:a1sQo!ӯn}KD;+a<2HR;(s{UEۼn$;-U_XڰbҹVRE !+jS_3a1t4Py_&A#){SrƘcd<RS&T2P?j:RN5"ǜ;˜|HD׳u ds3|&˝R 3f'{Ғ>AiIkkr>(g+k7Re"YB/l:pz2!q6NfE=tbdzDeH\S"ąr-%= Dٓ.&mCIlYOoΓ۵qk*Ũ-tztPڲ^GW߄"UtB=UBKUBIeGWOmXUX[* RQm>CWLIFx*5t*J:]%5' p`W^W_N2ʀ9RXg1vYy)gz@βYZ=f-Erjz3fBFb+wH7*~^oo淫 _{WJם5[([=!))ajY6whӾ$;^vfβ#IvCNV>qT)y/!j"BJ R䨔D%uhY}-,Q3w4Lz=+N'*ߪ`TSUaLO + 3_~2RldEKY>+DfTXT`#)E|<, |8';ŘZ(?$p8ʞ}*+o-t]G{z}?SŚԻYWÝ\beH1ЈWSǬ!/X4RDppU+mtYCdb\k>Uee.8fhxjϕsJHA54=+ 7|cEnzm_TKWEb^Ͳwe:i/2y sZD%gKr_+LRsMeM.A.X,ow8px= MD׊2O! 
w@ ~v\AװXۺ!yQ=`?Ӱ&w5CQ7&ŠvsaB;$"ƚRs8aZqQz)5dsL(QmGs+:2 cZ)"VD h$ l )~P2'%SGS-(Ɯi#(bAyL-0 D`Bjz[e=@lX6[D#GhR c~ 3, ڎ\+%k }Hf Le*+lEf5l6My3Jm)AK+sљdb q& ͌ß֕}Կ(^<4,X%) "g\isf@8Zn^͚[=-EKd{l:(Jβ }XpBfK]\Fnڠy:\zm`AJD&!PW9ׄK*Ljd<)Vi bDDLH^{0<)c" ׍۸.hlGᗫrP:$Zf_߳yzK;죏SxNg}6,ږW˙Mw22j5B[p"9 \ßD !S8e4HHh{a$˕w#*AA5 RP"*0!a#M@EW(1`"{&C)$S+/%%3Ozg%0l\pCby@ Ϝ U,ywqhPfs$m_ 6|Ӛ|xΉG[줌^ ģqTRC oTY*zXIu Ǔ QZ{c56(mH"XF(a%∀Gb!阧Iq$YEzʼnpFS*j1@:+Arx%J;mirbS5zݴX?01t= LȆ&H<)pUvp#/c|4"TH<ޡ2HZ)"ւ9&LQKY*:ߞ˔1h`)|0[8;XnsaOVFng娟6Ts_鿣Ͽ_P{+ɛu6T]9'c_P{?vi)qЛe,2;'Yߦ%:˜Jn|XJh1?Prqkʑted-xxy՗ٖ?gfn~GUO {?~UO?xZ<:_B4chrY,&&\M ?O'`d>F{N.,A~ ?:k81Fk>6'6s.pY*)ځiIUy ,]R곫AO͟6)β}'o>"ִ|?V._=fk`]^l<^̀2>D3.;q܆c$d/ć=qɥ}_4gg2%-xf0z=_.o?MtWYo٩i?.꾢/8:'fQMϦ>¤mӑ"dI G#o3q|lu[zSxr(;+MόVz<dz,|l.A1ƘaTd՛RͷA{.onl|vO)>% 9 g x~T\#^jHU| ZXIVn}KD[+}wPlPQ2ҹƽO輸{74PUn^rrGePOyPDeɕ /OT/,Zv֩M,$M̤t3EU-79ct3+-9'PMa%EXEap{RCrw3>LCI;Zk` O&#8A^IsEh\$s`F ǸP W{e8!$.EsNaSƵutzۭxo3kvsUo8J)4|RXR{d|5;șcADEE9s,bkd.%RR89]P~)9l-({SpNX)*W.RpTD J\QNQwYp=*Q(*Դٝ:=V)nIz̝vX-"ro0CJf+Ac&ncpPnmN.GsS='17>qb6,ٟ?|{6Zrhn,1H*6n+$2=w8[ڭVtVRvV0v͞??v?c]Yui^UrЛ\ޗҼ.fכc e & ¼&BP5b4Mj+&9NxQ 9^$-L]fef| m< (q>-q(2ar %t?1uȳ;ۣuV$l-[}G֭5.N BJzxXXJQ,y`2TZSAlt (WKM)pZS=eCګüGg /4瞾BfdTマ/1]ƭߞox߼]kէ[wW=[h0.)]˘B 9G \kib(,1N{7Vhx@0I5Y ywÚYZQ")9IE$s2?{WrŸ$c fd0Ӥ) ^xjJ<l˜fOOwS+ t tt`9$p%k'⾗[*/-p]S8f݇UdˈdԖ,R!d*3jg#>YP\*{~ŞK5HwG3R]F:$8:"iycS$r^K^DAoB]`F3 W6 w?|mo/kʻ4(r p'w$0sDu`w|͊e-&M3?:y ҿ YlJoW㰅_H/g_"GX3 8)=+d/Vv9vTѨ\.oc_ڀ|mKU:j2Wk8/FSޟǽFǯ2y6G?l]0uv7]dO^қtz7}#L/I+7ndxU ʀߍƟ_ޞ*VvV,v%>-o/z/F d~-IxP|ϡ-3~_n8 8qCNn5rI ζe"g4<:!d0`u9jUf6@3z>հ$ >黫4<*pY&n^T.QUIlJyǞګ%˾x7`70 Ct۴%U5Sx`#VGaVwcfOdQ& g` ]rbϖE;|9ui)FONh`BEd\fn0t)CD!tG.$CQ%t *֮Fk_q{b|%u+IV6eC6֍.YYk>arG(\A!zF%zJyF)`VJ.5g싃ȎZ¼.bFX:&0[_3fνں r-&_H]Xo%nrqXHrjE$KZ5@H1w2Աl;)Bv5 ku[ACqѬMvFuE&;\T8(Fp4qE=V Xi͘Ǧ柣цNYW,wz"~)Tʸ~ՂVUAV5sj*M2N||G 3_Ɲn|>+4\llh(m|Jo+){c>*) }jp&ؑ DZK``N1.(ڄ )+@夒2XD*Rx1HoGǴ7UqGy@$y0}}x,44vzvz,6YW65 #0#20I`zPܐ>{o t&erpt\gr|[s{ҵûd[(鉩rd2W[n^kg=>ǗySڣsn1g@%g9Dl,˿cO&vŰZ2(_xYS|>ri+JЙOmZ}. 
w&xq_j>jA滭V!xYZبKxYH-gbO,=;\gUA \ [Hqlu_n.xIŶqk˘.V6̛.lvW5&?]yxu~1`~P󫃏ed4`T촢bqu|5Y?\ f=;n϶8rɒgb3-}2BaYo8ܨ\6)}(Hоp)NN#D(,mRq2'%F }4,Қy֠cznlIô}(쩇>haX;PPVkX{9>_4,-JmGe"pltRkpF)]ϪXsꚕͬ'?P窬c7H Ť$6QCn\ 9Z.d$U)l֡ B'ҌR=tTC*d y;DL,#ѧ,WsqOqy ?O3rssyhb`T4Aʠ{1VIIgAKfL"7)q3>6d&SKBÀ'P&?P3#fؖr2h?:8hJj]1l6(/CN&BYL?"dҩd)DaQ/"jr@iC62qQ94VH!+%hZ! %tWYf.ɂ9x' Ler޶'gqVRҿ!Qbh}Ű, S2MŠVauP1s&oYvuIk^~#JDQbZ(nNZ6LfT<"d MJaǯU2!)ʄ ,'Kй!3/ENzlN)@{R$c ڦH䬼D(=ZLBdݒRMVCe, M/ * +*vyU, 06ζ|ugU7pyx1|(PIdCv*K4%{'g*e.D#/FʺmCq|L (bS F%+\|;. 39bf&%vaMPv'ُ(qaEHs164G@7B$ 98Cq{\ShKu% d*@MHSQ9ig9ZSk9>Ln;ksufwL=O]f_.rҦG_ SY9-|v[6}ztǤ1f;jFOm{oͼ| k3VomygއJ ƞwL(4z#%ŷowN;'윸F76NY#Ue[Zdi&p4KڡXiiP¬$3( Rp&,NغJ?e1{[!7<6*q.f >"U̱ZD'mq"B8X 2TL2qR2O-e( |#̷y9}GȦ{~@긒,s9J@r5ZX.N1):8А6홷 άiڐb!|C 7;kV5f}7XyL8b:ܳUdfI~ vyzo+8::*h rk)& ]҈Eʷ = { ,(2fMN:Y!@ ܨĹY c9ReJ8hc(b Hʽr䳖=6k u|P}^re#!QhxH!RdGkd5.L5YSCN?0:MHNF24!f YJ(SNR=S+^{?{}aGd1C넥"͒d`$F:XA8hw\-ཐр2jâ2J2&wmIW &&`! y9F&F"r _!J6mDәIY"OW% 6j)lXXcQ0xzDP"iftg9ܑo)`CgtUq*Nx^siMpl`lk։z?Bnp6dG{zb z}EkK%Mek7MGLH&fZm'zMz5M3N ?Vs/@~Cb3Fg~wuYq3_o$3_̿>;:c o~Zș0jgT^_44Ramo>zG3cc4]f=YO)E H}C=}׮LLf˓͌UfbÚv7V/V5w?kj}l jm,|T2 O)j^ޭoޚTh[YRi"ryQy7`agC:HZu\J'\W3 \`iU\Zc+RԀqUI" L+RMׇTr3એ M`}<;r)R~F 5,M0Vxʣߪ,s&'JMd[M(y"]Ib.^W,L>o:"yR]rvyWmliݍ{s|o~v ^.On@~jf=ɶ'b]2 bh`i쌒h-*;B6R|{OMݒYƴ-ٝ1B6qip^3M,mL'+JL`_zR-ե K a}hNAO0`-Ïw?\*u~VN,w\;p⎧vjO5NXKZ`hݷk\ABbpErm1:VvWҁpC\qNqZA"++UjTW=ĕLAtkwWVTpC\I!.)"+kX1" zUq+lɵ\ZeqE*ŀ^J2cn~68(j:/.32`iD?=|1iYjnM{+Ŏ^~/ЎꠛݣqJk-A8!q)2; J7̛G6W귺y7_{y=AO?Np|p`1 f[V?aw.V; =~:B$Y%AۚS!1n>jhʎk9P6;ֆoK|'Ջ9,hPrNw~u 3 &곴ӝǿ;~6!/_^n8m_j򴵣}6nv?yavZQa[j PԊۓJɆPNZ]P,IԪη,Iz++o->HpJS H]4rUq\+,U1"~{׌:wWrU?qe^r憥ZewH,=CNq O3tvi9dM2>Q߻E N6 f'~8lh'WiÆvj:eک4-> -z۸኎.3\4KUZf+RìKkuIBbpEr+R]qE*pC\ }Qq%|+{߭ԊS-nR!#ڪp /1Hr(WVv1H*pC\)ApE++W:+R݀J++,W$بbpErm1]V:H%}ĕLDl1"ܖ+R+;?T^jn-W$XbpEr,W UZ\WNc[A j'>ևTjpG\Y`;#W]<]b9>]v(+*V%wIJΰP)}Ą1{xk Vq>0fBɽ Z[kaNJSOIo'Wx_;DzT -91[- \`ˊKTr5ઇ9k] H)WvW`B!BDbpErA+R{TvmWWEA"Jk1"BwWRW=ĕ21WHJ^ HHbpEr-/Wu1*W=ĕeL SHq:V Pf"|#hf`+H>J~AB\bprХ 
S'JrUqe7mpQu_\gd,҈~{b9טk)Y}jە.^^~/Ў11Ǟ΃?o&}hO|M=c-uД3Pv4,fW4۲)/U#huٴ縬Iwr $ Y%AۚS!1n>jhʎk9P6;ֆoڣS\9,lIFO(9IM7NߌUϾ&q&$gz82c|R=_c&?˵]olw|Xч7o^a{7^}J? b5 CjTS)0V3*irz\Z:?{TG\Y \`Yd[HNj +%+ W$XbpEr(WVw1H*pG\Yazc v0=}A Vrg5wNSq>e0@Է{תH6WVS`{VN+x6e;eξJѱٔEؠEoe$\/k'SLq*p\=S0RYm*\_&ԎVJTkT77ޮ[ W^~6|FtkP,~ }wڅξk?vIS͗?>gM"c_Ϙ?_- pZKohRKiUdzb+֮jZx+wqu[V?\m]RS7hrj_5.SnرVрlaF١1i]e|$x-6,i:u_f?g`d-tLI\9w1h,Fb~k幯Y@ɜ"KWtcsF `地DH Á8 S:x'SA^ؘuB5d=Èj-յR1.`F&^AF:AX$@s)`KBgb9׀֌!fv7"D^):fXFY{M2>{X41K'y\_uD'ɐ p3!kĆ0"hM:V+}y9zX ud~Z-KōStH5R DPs4_Je@6N!$A@r,ɽbџ@8jEJ1%g5ә︰j_ \la[3`4Z>0t 8jc±vdh>JҤK Pc0Yg`}[hLʉ}bI$1Jna1XuEd-hŘC?1"81aj#QeIatЉ_f2YZb<6V̈́1؄GE\!+):RVf"ۈU`^t֌ڬ0:F;\%yD 6ZeRw9a% S%<<}bEg1֘Oȭ@sZ%DFsh X>+LFT&9RЀufB .)a<%CXgj0[ ejfHXtf2+ҊF"a w0ĺC5ްes %#cl`0oEb)V` Y(ȡebAZeTykc[C!^}TlBae 1D0HMF1QfibtT?{Ƒe !,LZCM3`dzu MjHʱ'S|%Jjbtc$f[=N*)ؙVNecC[߱ X]_( 9lkoN+%*210J892r^cQ ԳYG"=doP* Bꮺ2ZBQ~ʺ@J@K&ѷLbgV #5"jJ()46 JaONXżӬ(aۂj2P$ 6}tg ~^2d*,hb N("f_]b =N D@6DK z1h"Mƒ]ܐ *]P!([ݢE(vo0!4Kr ю2*w z6W#3K:v*is^u@ؖhd嶍J/p:D xd!EAAd&Ybu4(?z2Z;(o *"eȨJcQy7( s.1"xሳXːBPuDYlls B:gR9M`UPP7o5J@Y\Go^"l039l"Q}wN%Ta>ҹd]SZ4lbܔ@i4cx6-52 u;VE8.nAt $8z`2o6:W iRHuZ5$QhS@q8RV,8j310| gj7LJȀśC|樇xy@1CK67ŹFm堓Y.1eB:G DYZ;[n0V6bXW46I2'~῅+Ip +ESA=@Q ;b}B!D-j #EiQI`bY`-\cG\ "eq8QcsJ7H0s#SZ}Qc |,L>9.J}fR,JLTPH#B5%i\4Z`~ɑ]rmn.>7@K9^uF @(t\F9 XV2С m@ >N3#k :|o*dF)0 8e݆s3FR / 9mِCLX[-:_MC,cd0+&ٖ "CJ,9tRr -0 YXcy]jFէ+Neg,!z c #jW崝mloz(q|@fŢm0rRHQn7.ޡ8}+_G2k$A8F˱ڢC]w'軺3M`7t99_.ɤWOKkc2]N7coyfm-hry8?;ž2u[rEmi9Y6]f|WN/o+껲 1uu{YZx12-炟+k*8ct@`yՔ+Ew'?yY ;/G)=olW2;f.f7q1.."ʳs_~MdL/dt];m0&m"[SM+hdV&9'.sכ:ӫ~jT]Y/&g88E젛?lxQVh{Tu8mQ(MmTb:znV"݆?%b6)MiQ6)Oǝ:n7VgAcͳd~<3[H)2@XnF$͞{\?`eټ^;F\w:Jz7- VTHjn8N~s#fd3uWzl~:C=.zzUdgxy]䚟 f̪bg3ߺ=R,>9} d~wI 2:oSZϾԒ|7 hUH2lzLp&r4kGdTqAmotb89Lf%\ngeiX 'g'~dEם~ +MOUr/J\?05rqb @f 2x#3{tTg:Z\q}%\y]a:D\ƦkkR O4vK37T?w+&x+_4YZ!ԇ41om7ZBmJ@w7e:.}mo&Ct͗[YC-,(UBw߷g̝ӯqG]CB;C7K㰴@px[V-h ܩ'%ԏ%"C*E=֮>~$.P.q^%[]G!7]c5 jն\myy2|x 
`鵾7x-p۟`h?UsjzJJ(^Zj%s!4ONF/aszqrpc;zvץ~#]>:xj_fVl-պN^e{']3amcs}}C܈\b;mKbR^#'6:`_reκ`ۄޢ/|9Ea2W.7^I>sv&k<~z9|2uLU2zByF缱siwJ1EYXe)Rmeyhc2x¼oNo?95.]yL/_-Y N}S.7z9mҟ_{^5x֦3ly MO8亦VuQ?? wϦڄWWYgˮMxmKWl䍦-f駲5`եMVrSS[ jC\^W<9]Mu8ͮA\f7{{W$w9Zd{OG? QVMU4Cy=6}~5\tRM?|^ $Kon|[[\,i޿m,>st:N;Ja\T?})7 sVUg|qwVm_Z儞ma,~޷˰q;k>00*h$΍w<7?rmr_2I EET2~=EZ][s7+|Xh4~d*g7S6¥a3H8(Yh(ySp3F7u:*1vk%WʆB쟧dG85Ɏt$A}i/e,5#'I$og'c%RT^; ʱ?XɷIͥ~] hU0`4z,SoR45yhoWq;"A%`45X^/˳Wja=B;> 7%ʮW|7?_'o;ho;- +Q ?6mgﺭ/AًB緪{s+-MAJxu7&lCCJ!@6zlɹs䣷k +NalAu<(!vmL/;i>iBL.h?MmQ.hlrɘ*`3H.,cAJ1BVT/E2ZR6(8B$E/JRF"߀EP_jӬ;{la =p4E3< |Gp0uMT{ō76MO.wke1i]{)g~۽[l@h۝uve iL׹'[Cb`R޻m{\Z"SYd >Xd62yV1`Yf,XUФ"2*L5E(2&Vr$ϡ5shӚJA͎ f7:qߞU7|E5>9>f0Pr*#Cɂ--`9eb;튐NM%d{m+|wVwY i{5{Mo{}@Fip?-TG#DeS^;% B J?ځUNv4S;PAZ Թ !kXIJ B&T E"fCN+MP_$ },9}Q`*F)H;R1ZXZsfݹ#2Bwآ>-; aϺS3yyy_qd^RGO1b: 1xGWҀlEYԝd:3t]KVA,{MsrdSDjd4:v±Hcz5!m3X 3H '\dQ&Iiځ 1)amkI%#fc+:T\ fQIS@zSU;. ߘ\7Ξ1b:bP:%~}`EȵI|݋3fCtJ uWfI 0l-ÄEpD|j9Mz`!5aB;;HٔPI/EtvBү6YBQcv1O>yYO# b`Ӭ|wv |G <>Bdoc.! UCi**5I4ziVvxa}F|W^5=l^ø ! IDUb*Uc BȫRS=fP`P׀h,4]LoT@*#Q)n`rw-ƫK/_0y+Qث\uS+o.d*rt{qWa5 Bf*o: ӻ_ .SjO35{MxQ_0 Q="'6Z0+a–ȩG9Cb_Ljhu)g]z9>6G"urJ=.!&ji.j7 @ @#!2=xQyUrJͿj]lCussb>Fb8Fn>{;zdU׷}pt3\e8-={R:Kn=ǵςf vja]Z^;\**?[;X;_Tu`2BIҮv4Z vApMWrXP8S䵐(+#F R9ǝt! 
pS@| JĈ["*3`y!fmrbmH_Bˑ>>&D>rDf))L (Ж*D [GSku>:!QyIꗥ*1kT4*c, $ /gZ3&U<{gzx<]u{G; G01E1, YRhaFWIzE-=8Q1ZSS-15hj Y#l"pGg8l#pkQ*M KB\)*HӖfj`"R-_:zwl0Gύ_^& NrCDQctH2$.jzӐDI)Ou9Ϋ{uz[qKτ]vEol7C#2 e-嵑Վɀ3#lԑSqРnGȩ$ve 3Ԋp& Q$G9[0.3F&jAS,#)j}Ʀjv,g[ez Q_(tm"0n!HqN̂DR0)PD >:Dr\YgTtb44%`B;g.@BAFR>E:Z%$@ yAYǁ%zɘeSU{ 9 4 b ̋' $BG `S!y%$W t.Qs LjM-YƆ܁ǹa3 |^'M zvv{gK93͒\4 UQx)8ڻXN|$,>TyiE,hqU&R'&A<@2|0,Jc# AUbTykA )æ c]|N-:uhʹ i֒'/4 :!V$0BzR#m#@۵qpBq4>v_`*;x*ڜH$:&E(e!M;"ͬaraiQgQס/TS6 `\h&xɳo J$UQ)W L^ G}H=],td9(i %i^ReXՒ%&/PX%/,2)\ZrM [lÇa?(?͢G& = Q==m7W?罓O8.Lg*pГ}#6l/?x2"WlNň,ܪH<hg"3ުLNQA&Cԗ9g 5AP4U)^9nzrX*&΄W; Dԃg ;a \4z8L(lPAuPXGcMuY>zWXŒ LܾPKpͼn|-ǫ>R,sme"BYvJH2jNYp|&M"MxT@ficX _/O[gg{YSosdX<7IWSfnLͷp9=ɱxG^1GťXL:c`Ca |hi+X1q4*屘,ҁ,"\Bs0B\eJ8s]e)s ͕J\A.M'iapfaibD`C(M5ƻHz07/`|np[:P&d T |SM>]\^_q;ީ ՗]:(?ˏߝHMJԎ_ acm !QMvEq]X2Na,,%d’Y6|1o\FO^k:sv^@qחE-Y8 `K7ߟKmcs'*|2}mk6NOvں}Nb퐖 [6{Q j7ǪStH&ouzMۍabM;vںg.:GHB+ِ-\HIX*QDWA\0I ):I=C*E{5ښΔbGbNA IT#웁7q6Oa<؛~jg"NyQR$U2oK4cM_Lo@l$)Z ΃**yI3Tx|gq{gE9dtHkiɮvlM.hJ8" hRN,H*F)4ʤL+EԾ}Ciǎ~!1 LXG '~Vc#.ƼIfz_c*,x]7Xړgښ۸_a)HWU67*'/I7M*$%I忟"R%J< טu6˽V31C37?љv&A׌φDqo _b ~6[J;2E~?4gyOsagjyQm5M{A ziz1QMl R(&j$g*q+B,IM&i֙;Y)XTiEcqwH+t{NhF9LGR 2n"SƵ笁WҞ/U::Oҽ^:NL_gX{:<!]ȪTcNgj8@ Vɻ|ǥ*qw҄#)Ba}K'f-7J]ͱ'-]Z}zT|td>s^@Qc:mkx)?>!ơIr,,)15yY)RI$Hي7"zl+K^[~6C3勨(Qkm^`QY+& 6(|(Cf<22YZM)y Qca^O;ƜF$87^IO0 EQh:m%l-Fȿt3|E=(˟LƮ]dDW$^TBBpNEL*)3rUjnc[M)zv (Bh, !6N$F@)ye\4X,#v 'ne&jR8a1u"+KmJƒg;W@q tzW*yRz͓As}EA=&K*% $U&UPaYsrGfblKqID JlY$5¸Imt0Ȯ.D;c}ULkŢ$b3)YzF NjL&I9`*i,R$,pndMGbs H5[mknEV ɥyݙȦVy]51FN %2:ɔʄSt" W65NJu,>תGs1c;Әl5}gXVSn2~%j QQf m#%82 nkyűG}n+M@ qGtlZYa̕`aܥisU. 
ͯz )do_U|b R|{7ܹGT荒m -δVrkHk\{:ol$ݮ,Ӷ Ҝ SVB-z unLus!(U"cHVFJuȡgp.3YcI4ƍ̠CҤ"%0V;c@(#P`fnroˮ(E!% 8E&7d&t)MLh[EwӤ5 _abӞݑ Pg4(笰IQyNh Vȷھ|݊kV|Uy#Bx=r$L?xO8σ19:MTPXDV^:KP}TIdlkx1<8gi-s<)>Y9FRc)yV{(:x.]y,Ca4by \+rtiL\1l9[%t@>mv%*@S\,VeRc`ܷZߘ_,JBY ]4 SwEMGv]{a;t!Ԅ/ ۏ㗻|q|_w~Q.N /7_?ROY߼>O BH?ƄT:O[˿DH.7ͻ-9} LJRBǾ~uzRwԂw׷ئf 3غ RuKhkOz;8s!'3WW?k_hxs0.$%8^cy 8iSƳmůi!u0n:u+^4 SKy?؏ԢN>~dznq;KvOoPx; ecڑ頏5vgQFic&^\.jUJzr1p!HKߧuWmW[lnHx{S!Xܧ2&fDx]7hV.Ԭ?m7ҙSU>v6rJ-tX WLд=/}}ͨM\Oyw,u4;oQ~A|wfL҉&V)UMWawYO6Wx.v`nmJ瓉+3Yx+%% >h,<,Xծ4(>ڱg/v,Z#,^))+6!D%d ̣E=O5z"!U;U41؈}y;70.y[:_ ʶ竻`]ϏYֺ]ϵ'~';  Y% +a\1nxΆgmxhƶxlapNGf`!AXQYPJNk) =LX`F!(\L>r\69":%J}`VD x!7 q:]ޡ6,9x:j To1},JJX?ŧ0h к `.zf ūW ADztzl?۠1}H}jGט3X` }H43s#j3M>"\M-=խPbwr!S,`=J䔻F@wo_BAYY h߼z^gܻ{t^S.pt+XΥ`p:To_Oryw޼`W]{{?ԅME-)p=O^_濏;Xy.dחzmnkŭlf:_4Hj6/:)EL}[L&K }9RkOV:S8X>ɳۿzZ}1u_6KM瘑4,+DzdlL2W MSYl&qr tv8Kelg mؗ+;,S+6Z6ZM9k[]xl>Gk4Ȍʈ26[ݠ11TZ*6$-NrtґJzp6Xwn_(6tg;d/ K7JG"2 ~s.-% m̊NCn_N~~_^7ZoW$9%e"Ωh9IY%7\ӊJ`[mLswEcف08˂>s6qiX Jr !hD <2r.8">z SLJІ,6@q٦d +4gugM=gf5`>.U R;GF'>,"2Q{LFTJ > ItMs^N?g]YsF+|L Ua27UW<͸\ݧ%Zɐ'5}N J\$ HYpEK=f+IC2\DS"S"J:#9ag'uvҊy۝#u&\y`ɀ׌JLQ9bFę,G#Ui9rl:~}ZU_#iDyRJHTƨӔě$S5SՒv܉z -|a[݃=|h{HJ8©"$T&,w,1 -gh VTȧ>أ9XƓ/m(HY{e׳X UܻalʰQ5éّsйn>DaɢVp‡ż,P,k{%MB8 B*rAw ^ ?:zV:&ƺ@.`]!ve1>KlD["Z0ڠ q |n?_.by]^V8P"zsla2bz_&R{[zgr4]ypx_Ao,+0[<15E!|WvH뉐(N6J|`+&)8e /Fz!ipΛ욢0"RrU[xT6Eµz^.:UKX@%=dҚ\BpR+e <,H`)!&J<a U[x Yr3gw)#/!V]ltUG82t1|ظߖjzO8ͧOfEtI!g%ISW5Cws<˴jNPRQ뼒:S $iPTĒPrE" %Rh$xV=Y'΃ JME'>"~D}kѺx5<; ɻ$mHSXBt>>( $I$Bk )nQ XƎՊKk 3QhQ4%8ːEd0@ޙ6t/LO"QG]|7ի3»E5~z Z`r 1S׆J /A|qEQ*N4#xEDNI&yhK2;eDNFJ N4"y2\w0t~cg'ϟd >0&YC"sxd]kpILzr7Pq}c\_1vf݄;x СI=5m büdCx !j=l qIQ T3f[ K%^MٙP/+W/0C+˗P׭]\^7-ޗ2gUTwwNדڥ`x=?~W6XM V_WYjU7jeqߏj~ٙ7w| x}.W?}MX]ɳ9S^z\es2=9~0_HW>Wۚ&eN #R齿LNBg/';w.)9cT3jEaBľ`m2|c:8%'SR~;;qyF拸ń_"tJ~۟F&̏8tTuyYqZ& Cc|=-Ji5>T˻ȱ%VUΆ5V;.ڄp{Q֫ba=P~N|Mv{[sv!J*\T+~:?CKg'v.$9_o")=Tp٭qߣF`^=aiL]fjKOYI𙋺5x4ӥk Gޭ1 %2O~^ 4do^zŵ?'o-'}~,LUϴٺY?V | rG}f V$ZC)r({33̭d<#BC`q>ͮ>k2#&Sn,JKHKpAqTJ2/ j. 
+4`t[ugǠ@6ӫ1jkVR՞M\Ց&ˈ[T '_Jݲ[Vr!?.)G7AoE&qa<:kBt8ǩYBד,̗\L`-ҴZfg=hNV=T+eV,R5f[uP #rTHLiL[ ݦWyTZt!9/qBwGEasjA 첮m̨:ˬ>;6t+]d:\UO F@vvU76̻}tm1قL̬"ݗ=-ς.L;͞N$8=XdW<Ł–E$qٶ4LZ79S!hSy;k˩}y. ]4; O+EYGpP%Nx04@`T#TFOh0O#tB)yѪQQe96c,4 ˪XPqP)8^Rmpos,L;Ù'9e"QW@t51Z.ilp,-Q;vlGY'*õ G*Y+)H%'9"W#נ-'B\F:0@%X&Zg\g}NSV2W!ncӹODW)[d#(ϯ&KV[ϭʯmcfX?F1[ ZWtnE޸:Um^]2+Dpj<*3yDr8ɽrIUXXCa4RAYV}LSV@MVj,3=}y'Kt|bْepvZ3/.M`\(Gu.&eJ,hRF,7J5%T MvIɹtz>o,Fm3aȄQBJyB&xei0gҝnEfL `B8 <+H=j/Ϣ<#'\>9ESA*g P'(EDjw;iT Y&fq.pb IJ6 HJ c 15cZiS^M0J-^%]xTz/H-` +<T hp1@2΁啐MTO=)V"azFẊ:l]9G< N ɐxQieB$H#.DDҔ!JCf*v(ؐmxԌΊ4(<*ޚOW#"Y]6G1 bZfW.PT{C[׫U&}iFp(t𳃟l)d b^r&* D&( 3gsa];Jp"}؛GаNj@G4[hFpd(1,޶|X)Mek^X)>?y陦:rKx- yj)&RhjyMTZT(*T&lu_wt>:~]vG{+z ƹJfBpYxkg2f4$=DOgԑZl9u$m1ꎜ'Gٙ$D4)pN>=Xc *%x#(Pўtv)R"49V1J2wQ E`n!HQ'D ͤ@"LO0YAO3 ت:ˎDzѣF>/!-:iIZˢV{b#HJ*p,Q`hat5Z 7mЮ!si֒'ڠ:c6єj De brǒrv83Bne{xRqjs8,H6Eq*L+(x" 9ю?j#9a:LrPf"cdR%N'D”@pmRuhްH8ӴkczQNYM\S |27{a{bΚuMܳT\~/NpJ.W_ggNr'rOE.W3LJ]XhY ^W^O4_!}8xse2k&+' 5kf-urI)>:e2ڋ)7k&gܫ&kf :ξ"%@O6/b>2IjQ:'!oJ'Nm|:QmU_*ggGpqɹ:/C/χЙ~GmA[;Ecp|KEnXzʪw TqEbe6]w^|Li@K§ Bû>q6oۏ2b Zzvn/Zohc$#kc6mo}18ۈXe. 2T MVt1|i> EjIh_ԬZ.WY(˱q!˩tQ-iUl(zջLv k{n:+ӧ\Ms"@5vkG?^0lwHEN`n&mo*7E]i o\a hv O<,h}ɶhL9k (x:-'Ae}|s֓#!Rœ@J0XhrD&j͌^6h&„#3޽꨼zk7`3ez_,̤$/+[ڕ_NrX,Qd eJ+gI127o!C 8:(#gJxnM{!blR[Ad\O@uLrR\ʈU Y@S@KRU6<~}BFrq#/`B&|(W*Ʊr줣W,I{rK ,?^kn~99$%e$"_jY" >ifx'$=GHb*h ^$ey*dnT3<?B1[)k@Ha* K-( ۷bmٛ_׃`q'Ѿ˳ީ_3"7t5cCUQ;dktIzv"+Tg\k6}o
_Wsgؔ_* 1>hTtIљ W2 l߷BgJ1#x"{ b:̯GU_8<|tW.(5(-4#TA_{ߝrCaPHV 0sN/5deGsSw.ܬG4\aeҜ?O]J%%bIYE!v[;ɽ-E{%k?&b~:`:76|| 9µR1c17-o;1'eN9gy 3=)DH% qZLoX%Pt MR`cJI`\qYmF"=2 F#շJi2FR,5\kCJJ>pH.2 @H XcA<9Js.77w ˬImKdΉf@&'ԺMw.8zU^4aaM:UEsV4.fhp+E㻻h#ij')׋R~şpP=,Qй4clBfv6!vcvAOI31$1Ryq1>?{ U?ۈ/bQ\5R;Oih$3$o[4s#d\D;5g?BzwUJ%M$_ ^f(IZĿ[>|C[em 9*-NxD8)@7Z[Jg-I;j!5d&eg&]}32@AH5 = "6"j\2q L g@.*)cx OF#O9Q%a䆽0v)ᆆڟtw702ܻV8ާ.wO%2ݽe L}¤:'$[|^挋>fG.dli3s 2@srwX>kؖ;7# <!ȢTRC9O'DKl򤀢M2(w U[US@o_^O>ȉl$ >B`{צreK_|G@ˣӊ(KHQr$ ~/lʝ9:Yn@,InMg0=biپ{8J`$J zS I F#T.HJzP<(l~ճV s<g]&cUYP)"E@(+n"$y|.+fӋpq8xa^ *53M"³H#]üS`gH( ZJ'Mw0mb>2z'-{@l DPYFe53W4 2bƆQuL2XU]]B?O/NCqO Ў|HϗBӍ|֙}) A>ɯ{?[ТqC*ZrB|}=CCDv>gջuI$IDqͤA}:i7=wru$5i{){;-kv 4G>j̚#ߋ*/5gZV-Ҭ8*)9p)ѣbv/`&PzftV f;E֦6$S O'^7e}*YazjRBWo !8=_h?X52TBRIUoXL69ۧ߯Ƿ9n9NN)KJ;@\f'f]*{7Imb2zLlUrEޱ٤zmK;8.[mpM62I~2*hY3@m{ΰMx=5jDWu|۫{^ԙ .;RVLDދ;wn̥_xFFe2ÈE$ιRNZo-JN !@TeY^ة ik/@v=cIЊj=XTzДH}YR@|>:8~Pz(f:)kԮd76ʮw=6z17 :- BД1{=*Q5]5w-=BSe5_4R)C v@hшm) y#Zhl>14=UL&`%`$uM%#xAրfç1^}!@*;^}y>*)ޞB9%9o~:MBCP +oCZu$D>g UrU.%]LNOg,@4X$CϥK:./pܨۤ^i6([H1߉ZAix Nq >N NmhV.kAZwz!Kjv*QZ\deHA%^r$A¥D6XYʢvǏ-_%5uBC*IR'Ir1 P2'C &VC/PTo}lޚi֣Ei%A3SNI(QhN w6' "kP<\+7J?rer,&JF`sѼɁprM mF+:(n<(;AS)G>'q)B>Z[X3{:1@TN8.~n>oR_[GEģ9OQ=fy(?!(8N.6),]Y,NMPtڿ4Y oݨ]Ӱh'Ÿ"@џi]gy0B@l>mx-nQ&y|*Epy24-s.}Vյ7_D5>Zݕϛ:9I{ڃ-ǍGoշO/F8 e5:A߀ڀK֍(")4 XIE@4x;~\ոOi= ^@kOÜՐJ6!]UO ~ // XAXr&3ri7:;߬snG-6wyK Nee*2L%L-z\BSIeMU,jszt8Qm=ɓ/BR)s&"k(TYJ LA |4) O@'">5N! 
_*W$1(D%H_I%Hw&k'yd1#!d+!Iso[CWTlU>{o慏EWh6$!dFE[W`9eJR[rLo@`'% (PlM:OZ0| ڨ<2S Ew!}{u씨.nx³3W'nn~'Soƫy-`;`mi}@G_ Yţxލ1P@jf,Y捄ͦgC$ Nj)Fdf-^D'vC%Z__tױU׵ߺsLfAa`(|;6~& ׷M, ץi.b}s>^T]OjJx9 vQl}ΣKl~O;VX_Ryy ;u$]xu6pBV&A݃V58ug(=@fOhdxj0E=Yi?4ʍ(U8闷1u;whV#He8F [/{Y$,\j" I J/g΄SK;tNx%_n?})2.{'zZa g=ILe&dD gB0fюVԥyyQByIPTf$bAą3ɖ5I^6\BL*ȹ߂oY߀O'Y:[\[]]T+W,$PʰqC$%)vdt#CsV?䖏ԋx/vsb7;[dp縠I*M,)HSB^)Pӎ*u\)!sK-?bįeAo}4y!w:5TX]/ggu# ͤp'n\J"՛ B8骚SgTbr%guIfxd"!]=1I1y=]^K%_ my҇f5`[({UNW;AR9W:Կ׺8o7WuҴ*O,PU[3Z5ofEc'xA47eB*}E*ܻ{vqa[_0 R?L6xאXϏdNHGG\#/e&Gjᑀ=Hʆ҃Ůs!Y}b_El9Z4q5jʽf2pVyT"D %k{=&I(YHv׬qs~C~qRthP3ґE—]VUɬ5BrQ:b?(Z((1Vhٚ킌~dxy6\ y>AҮ]ȫ^`1>ZbJ&Jȃ$E3Ybw7"J.֝Ш?ZKbv MInj@߫)\&zr\ܵnV?Q/LI,.˔Ś ڃoW_,r$$8]f8֎e/?f,V l涓j¬Gu84M!,9q"0:ծ WѣZ[R"y}~Z5; *OgE鋻>!kkVV'+%Th bqL;RяUW,4|~bn׭m؆ԎfC~_=y2 &V2%dghXOd!30TD9fD1)0?PcNN炛1#LlĘt_XO'k{m$YEq/e6fi}.=4*XxUfh 4 drSX f?hUG[Mcf7P:w;=RY{~)* goS|zVczr͘{S*!z:њ/q/l}oM$AɞGhԞHDJ *VI" Fۢzs[T*H@͉EnS$Fd@)J;MXvd\`QB˭ N-7v՗/']KI}$J]nJ?:2ޟzm~Hh.v S!sEa Zrz}^.DS1r19cc GNk ]/v:܈P7 f*ɾL|lq}LνtFr%bqȉǜ0*)Y_}[ۦI^[3Q&a2-UTڡjVϏp1A_~S|Hp_D5>o&/O/>rFdO8?~0j~\j^aTR?X^C +te~I΢INGΐ> _.dLk>$^F}oqO /$\Fÿ՚^ءEH--vUWggf/ZF{svt6}[{(m!XmFtAVܪrn+ǶΖ](e]8}Nj g>*b{_3讆j^Fد߭lubᶫ}"tہf%nq]REXȒʻzbX:p7`;鉔z<4fG8r=>ԍ_W}di}#XocdL2<:I#Ty+'hP2+ qK?ӏ_ӫO!cJڷFmymx{WzL|jY|:\ˎB,*ZLt Mu W+/#HC5RM2@&!^S !us"0ɤvG"g3vTdMra@vt 6IIzJfZK  14)qa2ΔH]rgW3I-pkT1U.,e!˒PJ,HJ&-LTDH6 PwS]$b̌) m2(>Z "dl9tԪ|h eVGIe5[>R ܆̚2?E`"0Ǩ!>,CV hX.lt6k'J3bR.a ԢZG@=NF%Vl?4;?s!]gm\,HNsPTY[-;8ՆhvI)ov@UIits$!G#M2d䲥1B$7}ɨ5C!ZR2mBpԌD.4𥀆 <6sY΄cjʔ ZE''(4.#$jgh F2Qhg7f6Pd9 ZD@ #K !!:hK H]q$A)U(*VwkPVSII!PM@t4[hspUST_@s(eyX"5Ұ: cޓ`@,@ FKJ8"D X<N=$Z YFNx2h*X.I*L ҄;Rw j$  Jhf4><eK:s&Rn T<`vQ 6!2)!q2F0v[YJ$H0,d@k8RUYd%  y3 I,l~D j`&16\1t1A$C9G E@gRv4 $VQ rTUroM@ J8 92b\'jdٺ4 Ks2h)R6O1AAOGBuvX:;**@DU;͊qn[&sJ $'a Omwۦx}uٵq'SaA[w56}m64@ |tm^~.-Ũ<$ܰUs xW΀Ut FYuDh/Aʀvo1!5Kp VhGc ;f20jxrI'#.c09GŌ: hX%Y֨48>p@Zi298ks6FuF 39(k$`?Aj؁0"ep% ~` Ü@#܁?U2ӉTIXsA. 
-, f,yՠ6*gUnKoĽ";l'y공f&= H/4$oYBeF*Yu0q nu0i1ow]LgesMc&T`ڢ+Z`8FL-FTaM>;5,zSЬ!碦r8RV  YjBL9<hݽ=7fܵ9p7 JxKdM:39P.#dskh+Gr).2PP<*tGdi'O H@voܬȰn$>?=LW؊"D W}\'WN"0Fu/V0 .TmBF8RŢ4xT##>j ޝATXD 4.1#dnІEB?zV"5R&nV xzԬT gF x;|;byP`ҧ"pՔq'k[HrI"VŪR盉`ebBv"J&J$"Ȱ;tbɱ.klץ5HxXD3[RV7q>?:%.vEۡJJH!E)]rzۿoxɬjtKpJ2;gQn|' +p%g[G:<{P+K/FZԳ-Y?逾 oK4"HD: t@"HD: t@"HD: t@"HD: t@"HD: t@"HD: }: O/_Mk{: Xu3逾`IC: t@"HD: t@"HD: t@"HD: t@"HD: t@"HD: t@"Z  [z1:f.w/EԬ^ԬԌt@_w`%iAc!?MMA5kSw SкvoD 8>~`8]%?]*˿Geþ)[j?-79цouHJgn - JLm,R]+3 Xkh%+pc͠bL?kOlN/KQ bHm߷C7)-V>Ԓk? 6,w0?8)՞a'>}?uz{J _u9Y=;"ËAC ~ghEI)-|[OS )Z|Gi~#Շq~A,,Ej b]ʶQ6܊itR|M9kjA/οn} 7G Wmw,}z[Xe÷|7`3/-eOߚc, p}+1]ٻCkx>~2vv BDtx3:麬9"e"juV)DΧl2S$X8,eQ;^M}ȭhpkYKѺ([dܺ iΎ PcD79dYhW͒)<󋳴tՀhyzWwߺt;Y|\괯L\UE޷M@7BE@֡1Z[{|=t<}_fe;K`qն2MTd{B4YBWtېr`"xT?ZQϕ3rB6Э hhٰ ЊF5)^UŘOɪ >j[-Ln ;e4oS"L V*sF`4 5=e ZwY ˷i},H|dї#D mM9kZέ Gm'pۮvÊ_f f+Ք'eAI8}Bɻ޸E3V0v﫞7/%IW}gG-r^{v]/ek`s/1hZջ~SG}bʥ*m_K6oMoҫC(e`ѷn2O9k4ڲൃʍqCj:2OicOWI`#q@wUJ' xݭ_/&?/v~?z;._óӯ][L7-:wyn=/ȶjw=xFp^]`.l{ ^=js_zPl:nkx<]emX_:M'i ׇt&~ϝٿ;szY_ġ neͥ7!&^nDnzH˯g""owW0oO95"rB*[A[aqtqgUbSI;vVﳝR'>vNbWU^{o^o?oNr`~M1a]kzm ls[ Y/ExWK]vL5|7YWaJkS_[e9IV^zIW^z]tVQ -%HJֈjUW?贋}^y&IڶUuJ1Ba#mXpC}sʧ-"i>*ߖZxM>1Y :eg2mh7Z*AXm}qMݐGIc0"N͜qՀtrϨߞ ٺH[|?Ѵ;5kpHБGCGyp5@fYSTw~vz<HjDUQD$˝Zu9>$9Rhr1s'?_!~ nt<v<ŋܿ+|=t u!lm>|ˤ&AGe69u-=~Dϥy5\59/Ůyo~:iߴrfj(G[m/_ki\l5Q5tXj+^SRTOsCghmA~X|L㷱Ni:$JJfD6l)"S۫I緗-O J8mbl[\Q^*-3CRWR2vT;ovNo{ŵ1OϦF\Adj^<}r_+c1؉Ϸ?&<ۉ`Dx)T;8mA];BX&mgi8/n><D,lE)kɫ. WK Xyf#F'gyUuŎ/z˵O챕-1K^3-(\-ק'[û'XV!2'gR@. 搱W3~Oxo Qk xfkgl wA9 K P bς1C. cӖs-V1-8(eig>V3&LUWV|)GNQ~TmK ]c.:樸mRrrY3`XQh W+.VӪj>"dUU 獪Y{YN+eq(;+P( E3L"K>IvWp%LŲh).;3p8?k4㩷|[WyD i|/irufҖBvuAN9:hg4 T~đ?ߟVzb|{}a)* %]hw}S= rr6'4i~&'3|,eL{j B@:n;Aox-1^²33@ٕ#q~(Ar:OD(b=(<V!Bۗ[|Jzvbq<[ }kz=8}Hh,UරL+8QV;&5`ǧH$ ]'hBbSrRLfQY 3>rw݂y! 
>Ng' O") -SD ZR탞[Qô;GNP$T$c}E22W=lX/li]*HCBb\;p^L'X2E΍s9f֢Kea9FtEAP!!&-d Qε TmXm8-c=RVCml ` j vso/_\3=e^㓿hІhyt1rS\]H[.{%O.7^d.Duhe],+̊8{1 )ٔ&Q`ɷcgA>&',=d]m8-va<]M:ڶն6'H4rCUjI*-69H~,ޒV Wfm,iN I2䊼$ђSFp\F2YKTYI춇S?SP*[D7X"nx])i3h8jIE2 6VJf+nb.~GqƌUVl,`D)4hR"OZ(+*j6GW.̥SuVCl`|2NyA.WRgJU c&ƣ ZM]<]<{XM:յ@#/bU9nK):'C%]Phj$8fk*NU2iZc46'8/o[}b-;Ues.3Ԇ׃T0&r.\0ʪ&:f!KD`Y#fuAe&UcJGr5J>:VۨUv0pHt"zBUJsZ_.2|^ N^n)n|(~O6QgSΤV*AH>.8]AŜ-7:34J7hn5mO=;!`KG2%6IV2kB@(\. ,Iw3NܩnJxZϤIIdHCxy𲲞UΞzvr?TnSDX (SH GbуR[0152JQEDc$fP*pZi,n]@σNrD"DJSvR!Fr2SLq68IV?i012d C`{F30,R0$Ez ˨Qpr5x%ʛ!*+ Q&mXR7˪%ŢK=Zx1Fɝuj>h[zM)lt2gȡ++_ôzD{萌 Lr s֙pP^MeM rRZJ#bzS;0vRIȁڡY]{ ⪅o9]x'46dt@ C7J6˃dJɲa2zCNw7lyo ^oWK3CejB̯Y_Gv6a#M\I Cr -,IC"pwe&@rL[Ey5J>Nf Ů7y& ҈dgXZm.Ii~OCGGgTj\V%HM.eDGUEPH/!($u1T"#y9&-=y}2KKR^A_o17{dazvTi[v{{&':.{7)ͧ>lpRDVp\jUl[xBIk0wB(Bl=ZyI,IXi2h'd>+L!L8(m^r+\4/\'Co2Bj> JCL̙Z0#s y;&˯l~yՇViA*LA\N}`cwXx+ hv,*3Uɹ>onuYzL5b2j*`MֳyȖH+n2`UvCjo<~|Q]>ctI [y0K"`+AWx#H^>r̐a !3QM,SuLvjW[{^χusUj1xT&p1  0FB"/ICLR@)8!Ǐ[.'.A]ޥy&Fq0ܗ{:p.;<} | BK yɭ6Hc 㭏e@tih:#shF WWΞ![5y!㌣f)!+w tf9V9Oz(zϲ|>S,F^6y M_2pqeQhԼki/??ÏYuƓ_~lfWyה~ּ&lxyoʕԬ7zӲ]7e$o4o4+j&t-N72>U:m/}e۳pI-ÇEtxgW!ờioމnq+r jx3*ERpw9HKM@Hy֨RazA{N$5Lod>5OSo00h8a\ΚL " ~3Q~o C]K%ooiZi~]o7ڿo:@PpWoxFIvc oToVƫθ#V`'B2;3*lqt{6?sy#. Oh0d 5;?>pWnI.ү5k)!nFJ=^}BAЌs?-aɮB󋼾쯋H}O/8KOۏI)pkgݤ:C%d5@cI\c3x=D.袋!yA~ /?J:vJ:6%oе'w &UѶ hϴd8he~zBݳaҚ6zR} ڀ96DZ`j} gt[{X3JbJFOI8ZzVi121qڨڸ<o˃QII$w BdR hG5g2ԥ9 !)n2QҒ8wemItQu`lx%c<*uR@@fu7&&P%]Ϊ#3zT!$UHQ+NzW , {#gS&tb.D`HzYA31"$ȉ*NT! 5Iaƹ(%ҭ15 19 9ˍ x4!%r K )H%Ҫs9yB@M}ت콐e̐[>œ@%idzt3c`xa!E9/9Ⱦ9Q [cI"i[gN^3P b'<F)s @X{j82T9kG%׆7NcqTQ*:2,"1rbǏZ )$qYFӞ̪:[Czt`|'hۘ3؊@BhM8d|`E H;rbPD ܋: ۯ7'lMtY헛. 
I,r;\L%$!{w1Gwü3KSha'T73/.g?|/r,s鐞9ŕ̉3!)K@B& Zx>eBRFt5Wxx&]wWfR}v8Ic4 `LIٮsY} QIe=Ӕm$84UjsHsL3h4/hAxjlei "꫿굗ma+bb^֕#UfӽC>9HOw6XSW Ls4LPixC4gΠo'U&OKiE #/ .6~Rj|"b̶3-g5)u[dO'B5Ei8rgTʝSuc<&F0)͐RCrw,{Y,٩ Nikl_9{ $Q9l@Eh\$s`F 8D(+uJo?J(bYX ׻lzv/8D.c`b`ŭ#:F j'ŤZbι`1X>;`&- Z@k 9Zr;Af*uK3.\kYS(sW/5p̢a^ ?|^xCcc!ugIA2znB{s;.>1M 7#!t⹋xaҊ``匙kaA8]gwhՊpN*D|ĺI77XO9Fl0qۇҨ5'lYEs6SKRSY: J'`l_ %kDRNΨ Iek~܅k.Fylo@tc$ x ΐIL8RFAl?n1e=v?xeqD:$-h9I;%zNF\3md&tSYG--4qFۆHNňF׈D4[Y Ѥa˼ͨͿ*Q;=n  #  -uKGⱖdIi%7Kaʘ>c`zO`jhz*a݆B'Jd]w2~FU;qaFLgc)ևkıl6Nƕ?9.DKZKulY-M^72 DO1^;*0u@{^,?v2ಷk.ݝ) hyoz+VwצmYWs^dǘXMe9ڼb%$8iy :\rA4oۏlQzpUV-VF>|ɧN׮Goӏ?Lo'aGr3߾ٍ]P;7QY٬R"m.Q3ɐv,_(3ݻ(l^U%ዢ5|-Fك,\O~涥^NbZ^3³=뙒=H&;}M/+~aH[3>np;M=0ϵ щuzw&~\s\8^QV\)P:RFYnyPz*1zS%IJu̱HטmCB8a_>7Μl ݎ:7¨}Z; !L|(͏F_"gI)v|OWIX-l6M x:} Πl}Ҹ>/<-w0ɥ*`Djh}3o7fZtR7˚ Z-LC"5]?).{ſY¿$UUT%K>}0Rj/X3e3 !?r>kq=\Ե XXfurF77ai8/`߃M"ElDHM7X2ݜchOSy6|^RG5Ej鱙}8:{q>S6D{ *Lk PA׷oAm^p /1Ucv<1vw1Oθ<*kʼn=FWNbFk}I,TX)&B1A_ )^YΫe jJ[.[u!G, B4#G Ir)!F $4i[@yӃf(q^P)* |zPBG%J 0u|'+XZ>i UO~8=/v&a:?0nR|=pMI[1")1d .h!9rO; GJt6T\_g#GL<y)1hYL>!hhN8$|6)D Z)̠nZDd61;nLi&?eRJcj1P̷ǔDBSzp˜7AI l5M61D[/~coƹLѦZ>cB]0$ua7Vk J#AAnyϜ␜*&K[*>Kc0,_VwGgq6 ::=ss !,bP E>]Dډ#dc^hq:v|z+?~WDƠ>]:7LlhE;>w-hϸxi A(M+`-Ⱦ~o^^@]q+7b{#8 q%2yWlQK]Vs- @#t/׽;MA-WĽ'Q8oʛPg6ǿ{߾Iԫ?kKmo0ۯzo;ݟ;͘ҡ Rʣa QYYG#(-X\kQtRpw@l.Qd qUUZݑOܑW==*._u!Jd+APs pE1;+Ws/wߚ\yzSÈ05{7Z߸b-&r] ᰕJq|HT5NhI1O~w W *tjM68n\ )BI&ƸZL>qQϨ0M˻faJ\2siII%*J7+ET k y.`ZN> aRDRր aFCM6"V]h標O:hz 'ݺNe ".qT-R!TGol4lJ>SA=t{ Օ)@>|T6 ɯGh0=/ty܄I2LەA(jl 716!;too﯇YHZ^9x%qdXh2"8cU{DS M#bN+CC>3s EfS˝ZJ(zfٻ&m,W:4}@%L>Dt_K.VT*\~*J"@H::*I"dX+/9&,3ZWm}.,K"n&S880{R88.=:OJ/ 'Y+7.:k1]zqN|qt0ypc;:S6OX:~=ƬJ+< ,~_ Ͱ^ULB%~)&\{i"rx^F ld4Q.=lGY@RN<;k0acN fA=/..vI™ .3r3AbA<>c%0֗mʎ 8?P @Zm$c枇M4Td :|\g=E/<(QVbV)5L_.qcV)6*9jD z250'cɎ.yNqs5˂^t)bV||v9ƇYSLk YǍY';]>= 7ݾϡd=n3u1M'&Kx*: /@ U GI**!\D]STTB. 
T$ܥ10 S,,Tf焰JBVXQFZKn/Z1-¤q,if&ug`QIUbuQRwO;7&3c P ¥=Kc^Hf!/?ﯟ;?}V56?+P r>B|gj ;6rK@ +v/k\n/&,./% Hًv&qi_'Mab;ճ $>ۢKŖ_ =(COq%-dLٷ[Ɣ0@TntP,> YYbQŕ_NОSɝ^Kƾ`X;6 "DP$&Fr$:sA0ʂejzɊBhUJɸj,32@e+,A.t:ͫyzqY$։<ɱ !J8ؤ! yNi`"ワA(5 bu(VdKZ$|j|_\g$Oۻ?qsEQ(Pt̼F%@B7Q_{_EW *~ l^/ێIGv0cp|.&E$- #fIq8v3ۋzz|'jZqT؅m{\;JayU+c 랷߭CN :,\wNI3YV(HQ:wKNA.vZFC(ć!ETm[2M}y|mu_ET2uN-1GƂE~J@U+ȌT(T[oE%ŭ8Z/n6OJH FAsa]=DQ^(!px7XEN4yElOR`h NȫjE$Ȟl 0XC_V\=k ƗbQ"2o7]O@tPLpG#(B {;a]Je=9xo<*yKYMѯvsЇ{QPIDRt"cE,Fx FV=!*RK *1:\kB/ .HDV_x G!DP"66(.%HTEmo`S=RA" vXlihP|hs˧)7wwUEb$6a:y+TQ'FS!#l~2J I{TLD!{)!k J( sǸT'z.A)O%}v`H!h_.&|]^n^ ?ns ϕAEu\.8xro]JH3>б9YZ+oQQTQ*9^,+8+7"z.e~~r7퓑lveSUuEŠk`Sm̩: }zAi>Ay{f@Naz(}˱\SuA\Ѐwy=RX}3׫(te0Gh' z mVrֵG TOq\r#역(Q^o04uD^ '+N΃@A S^󏇛<@GºRXoq rOģ E<׹l e`}z{צLl|?@FB1ʘ$^3۵АJ^z3)C' En׮(o~!k=Xc|gKvaE-\:>n NX~緖S{Knr~/Y,׷_" ?om?s?zkw,_ݗn?vyluPpA4c 3*n3U!Pb{͂?y/nrIeD:F=zV;eǷ9F6 9w2GY9Bid1*/)Jם J)ش;0^5^LVHK0 Jjw"ĆfPKr)exM5ZokjA JTΊmA OKkgV!Iu/2yqh D;!BdXx9ĵ_֋a̓]h2-xJ/*Le"N1 HF f%d583é3ǞG]BygT"2BGڭӮ`]3L*t03OވJlgV iy$JǣV#djHΟ}=Cگ*I//XPq}8&W!:}؝+QYaH c835m otmZ<& F Rj}WJ4û9fGkb>@$>72̬`1|uDr5iyu΀ϛ%Z XD80U4#cJFi=ؘc^[W~6=VTtb&я!qTV=B.5eB`4?OsO܋{{.L^]2US|]T/1@t܁s^eBzyI)c!=^hY"V^SHInr &r+^{9L44],=,YA@1-Ga=L&]ڣؑ mvR{5VE @pYvS+}q5G2uK=uҎ1B9CU|vU?0loe) O<)AU. 
T!J+ YQ乔%zYOvujbe6|:{(=sv`Rt;i/MךuX9쀵pV`Tm`(2;渀CƧ 4 bON-uBu2{nqp̶_-e<~!!I>P$|˓>p$ "p<}['խ3)flDfXq}<+oF.Oס)9GV9 Sp=ͫEoA#<@K*ld) DSˢqY{`Rb};;[.*D љ'BF|1t;>5Gx򽁉~8^",~^@gsH)T%$x:?>Fu)"(&oeNj6/% mVgnX)%J4cnq>ˢD_Ϧ8N"h=Ds.HY15y&s8e\X uj7.F \*=BsPnKe%9ØJ$ 3'~],A{G5e띜G3d$+Eo)oLuqxYO]4s)=gK>Q-(H[8nn*!i'w{zx#Efivg}88Kï5qW9bci F&9??~G}޼:ZBҔ^J]*I*Yr;]9efx,ewT Hګ8+d}}\HT%iF)#Sh}iי\1։SDB3dUhnlky{hp "vp4@,&K.2GJ(} m)'~>|- 7;sJwkK`) %-~D%|0l볺{Ō$.wmX<,/ 44݇`kʉ#;J/i;lK6e,gJjDwj DC]m;udzKK4cTw; DMa(])!\v/BvhqO>_ XG |0E x#/UrF 3YsZzb Fp=>L8.h' /"5 ?X\HIY>x;oPv3cݡ ڀ>:Zn{VMX"{wUL@luP)KITLLQ+2ͅj&ѤȾOE;,T-Tb:^QZY 4GJuN8ON;AJ O/-s (qi3uyZ+& @s(U4ʛ(}7L4B_1].L!ߊ=r!O"4y'ṕ")&Z) ?[ I/Tr֙,ᠹ)t'9Y#B+""m֒2TP%8(i9hxY${ ֺGEoBxgo^DLK+V:R}?ܹQ O|P9144A~9EhXET9gR ,(!XJ.'UD0"n60ʎ s #r$"v" Cs"M3vDjcFz'찲q"edc`CȆ4O?@//nƀ6BY裬?Zl.H$Ў@k"{MUں$yU=w#CbST򜨣ߝ:ށ;Bnʷbu&0BY[$9P #.SwnP^ W׆M7R EіsPB G^B.?bW;&SS>k$W#X+LK5DH(쀔iƹG+*M#T=FS(ۅ1ZKzJuS7>\{<^1}~\BnY>p"]e\vMλ0-eՓ>PoM PoزWr]DBHB~LӬok'!fIļr! !S[\M `=ЅBXfd!?,T*U`wڼ /ޓ8G}.X|qHcmH6vScTT.pB@Z_JV~¬hYc)S`~}CMu2+dcQmEOx'ī0 *qʄu{(ԦC5ᡤV#@nYeVc*p; 0h.#(C򦣈¿lnziNe:E>0kYcT'v; y*-k\al8|DPqW&#w?>NWߜ/JuY~1O*C=bf`4Kd#)ajKqsY ݽiN c2xv`}N =`P's|4$.Zf_Cp8̲rE[HrWbڡ= ,$iRAiDhY㍋GZvĤ`X$4$e\"%;tX-k"]Vvnt4V侇 9nDhOC`31Y{{qJ@AyOH6M-k;[pDv/n-66Gb ri&h5tbYUMfT5ᄉbBKL_ie)T2&8$ ۻLf˶  NrxUr␅>dCaͮ<t8ͪXBCڡYd`/ѺOˏLye}Vtiuv) %iJBRlX+9hg]$ t=h1HNQB@DZP)Ş V35rx2')C)%ˤ`غ` Y_[IAסGb oTv4`(uC9A7Y5p0k=vb((VI+h7UqxcP=.рevA#{VH-k\!BЈ&A2DWP!S8j.ZbhўTjlԲyр{b(x5D$gq,R%:o&[Xݼ3։ $ʊ QMma(>8xj׳XmA,Tkkn~h 3#](l#(.47BY&e4+  :SCи⹄I :N :{Xx+Ďώ C ~bUoEKT);L{M6˒*뎓۽/scAW_(Ljs.Љ2<]~w݇I y!nè5FϴhT+G\FābHK*`tK`$A#FI&y\ΗXᛥix3ޗJzᯔګ$5 Pxa$7Uz6??vsþKdvׄx=q&]_+se$ ~ǦR}(`8U>3 2$Xlc]U``cwM-˨HMq؛wueq Z%?Z\v D G;#޿::hZ][6eSs۬X_JqiVs1)e'hÐ[(;#'>`A^f̨H?Jd+Cp|˯#18FJȭ|812˪B]~Q$5/&_I9Z&FN(ߺ4ϒ,{糇144 gɗNq$sz4G0]_4%bVJwwpv)U_BBR ʄZ#3 z*]Cy$n'v3禷Wh@ŭpڸ6^7o_=u>Fߠ"2#XV//n[%q7wYoS̗`J|ߴJœi\?##'r0Ǧ? 
b}\J37]0OMo/5D< J"sĠ=F;fXÏ[ ld"P0BlAUQPTF)ZR EH9U0W\5j0([]D6cwv>_ep<٫Q=qS6S*j@1iO xΰ?rLG"ﱯ ׃W8*}b#ڜ5_+<.C7FTO8^vV U;Ŕq610 5лosrϢxEh> ܠWgGrD *?X_d|ܲ20הIiC$!%8w 2/JhowlϙîJ?7md]+3)|rFszNLԖefFw3!ひs>Qh T샔 !'ӀZgJ&9ʒs}l~;-ZYjfPsaP0ݓ5$Nf,lZ=3X+ SsvK Gh>^nh/R֗x)7m?l/afqPAeG(cXȁȴ3حFd^ٮ7fu#tmEpꩀxt6cpח Ϊph+, ѳڿJroW8AR ?<ϖI~ ]"D~?5 sNиc@vZ) 2$8cё+u2X$f[&2\ձUxy-.юyȪ4)_-{jmg) ,$iRAiD T,4]>'oN}9pA$y\wR}hX9aH_?,J0# (h6[`3ta.7Î)@ J6Ʀck*q!/__Ko*=¾ַ3+V->GafdW̴n^7O5anj@z dm<~d}ǚކ=Nsz%0"#1P#%*(ʞ,iAQ8/1Bk^Yʁ)sAzDRr1XV{3ywQ])=\7ڗQs(e.Gq2sb>_˲o vQ\}Fκ;2gxMq OKx.S3@r> I AuSvJ;?.N PR) Œ7Yx@^O ,-0$K"9VgĐkURmJC[\7Jme`I?waN^7{>HgQ`"I}8D\!BBя鯴IEa/L<8)3N5PuRpĔ[&Ѫ7m&mWFYBx Vg k6QVܽ8l|[ۮen?|QHqtsUJ%řh*dq)Qۥ<ҧ]7$W 1ETˌҮm(0> s U\¶K[J&gה.Y~oQS;g qEq~V M ]NlA}24pI3ުθ_/镾4XzdKi.`^;~|[ 2[ɧ+G|eYMt"* z|t8Ҋ4-fE6ձё-˿]2λ>'[r!Xp*:?:-/O3ΒcI[g.[=7YQy+jU!N+.kbh.╷|VqJrAȀ.f˙֏,A;Z9~hu`{(A*}.xi3bLK2Sŋ*1!s`~$$. {M~궏m3AKI]`Us^ #LO\(5YGLׁRp ?X~ :ș|T/z`a,,zyo{3^YHNs& jǠKW4%$4Jcɘ%BIDkXi] ?useq1WurF=H,LKg.Ìq2r3urSSBB?$ 0,0[Kmy$6/;n F \-#tELuuszHZ\<1V#_*kk2&o|h n@Dί4< e [Ͷ0і#i.uF FVCC-1 VS'7)kZ'$語 /Brz$'N:P?yYB̂h{ȐU8SֲUZF|6ͣNf_OӠELqn6(|}X*x,yN2?cb[T,?198ìt؟E?L8%/bѽ={So_HF?5 w qſ}Z>?WE mZ1>^~f$N1?/t6٩;,:*?O~"ms->brko9 gt?ZxJ8S ӗ\ Z.Y'N \SqCe\A~ q߇ B({$2bے#4F:^RiEv %}&pP,[rD}+ExRfs"hS",-Tf+0BrFkþ**Nd4FJcګKX:"ij88/Xce~PP*B@X)~() ؼ 9;^$b|y3ĘHX/L}}kd& }!>x3|) )MD6o2f3%C4^@7:* P͐YKVNjBⅅS-31rYR:.% {IlN >j+VpF++`R܈RYcD,gmKЌ$>ݽ` ^C1y@˘[`QFUxTg2P=%SȔp倻" +%:=pť_:,q0(%<'cAσu)`f*ZNHõv *aB}/`#%ڋGcJQG sqF8,p/=ջh庄1ReY¨Uw/nC/&Ȍ/ qZs!@nXΌ¨4h ΙBYPN[S;jC3sFmJb0GD Þ^*f t0+i)]sh:>VI:I ߅MjOC6gu)X+Dgd38v73ndEcÏ[*Wq$OX뜰}h,C w ,xfBZQrLMƎcc E@WW#$s8/ 5Ëw\BeF{ ZͰɦRs+ Mgwy9L!a7c^" OwdfCP$V8'XMhnY68.Ai/0A pfݎ+/=[Q30D涜E@)~QIW|&BgƇŬCx|{ q 2di@ D!5AY@VYA sC9gk!nQqisqУyM7f܌vt1Kgы?amop䄇fPg1=9r"+nzgAn}ZrC!Fe B⸽йHDb&>/5j%F" sQ+bUwj2J"B&3Q^|)5 7Ƙc pc峔2|]mM̀3%q4k-Ltm;'yLCbX)/%vlcseR4;ONV;Fq"ޑEۚ. 
?7D/I_$^햟_gL+?[d>,S:(Lw(b6Tg|֪2i.S.!MEzcL]Vg`\ &xz,@n~3:u.#R`-6(;}a|ZK ?ždr,Rq|=b{s&2aGѿŔ[55D?%mmld%:jSˈ.w և>yFzץ)"v|8aI^oL#tU3.^?vY}P#YqvoƝ N:o ƁmYL)q48P{L9S:qҖuYV??Mp.jŃPj,gkAt;)ܪyG[.KyΆP ä\՘g.0&:*T9pdpUg6^v,n/e/!9!wn:*J [P.]!$EPx(Js%w%p)G[xo թ)jX))b,+^g mH>%Ҕ -X)u,!QA% v8skTxE PみN_lIj8+UnĴP#_σ~wT2XQ<'j[A㻈 *o/p9VfB~,0u+356_?~K^}ﳨ M7~ʟrmU֩.=ǵ/uT#X%!z JqlW|a0XOU~f A,qR\.}I15Z fB1;|S-i[!y `2 lÑ u]Yo$+^v 8y3<3`ca1h$QJjM,]YGJ̒*vKɏ WWG/ԅQ]z/翾3_PBJ/+YyeRO7PQ: /LWTOʁO;{烖kz41 f:TPu/dH{N:u4F6 u1pW8*=j=bJQ@s \x"kڣEH[CQ"w6kw2]5R}=E̎:dG(r#׺%2RE]ᩇi+G4dG.;t>#tz0${4Wpj>V"hXRaaG[BF4EŞ = #`aTY {u5_9=ڤoLf(#Y3I,6njTJ >fe$@cgw槔ޏ 4rl +)(#=`sIٓP0e:ܡR%fzL44&0U>M]iZ ($ݍֻa 1UJBA?y r!=B܋e{Rj48j.FOHi@c*pPV_D 2V\x[ n^rxb}=X8aWX~D~k0`w-.Cc8=#ڕM:Vy(^\a"EEArxՓf# JӅ>9;!Bt'(UX\\O[)r=zK<_@D} +%!FA,6:#;NgBJ@cmx*PjlziJj/2;JT]%1 mt35bCy}uz9Y|`:FSYN̠(ZJOJ أdΆ*ή70c`x\|;'5Qǥ 0I@QlCIE\WzF~%$ES6HO b0ͣPC~}1< z@<(`,&d)WRg2N]V?RSB|K`vzMyѢdқ@c~7Oڈ\mncR'+oV^J߶zrRIrgn`r=#@c̀)2Tn.Cˤ v#ǭJlӨ$)@Ȩ#D`[#76#cT$ۦdmL=>P w؟j|h錚iB>0垽b<ۆ:3/OZTM7\\~&' ?*EZΌR o%oȉ(e`EWd~;+Jq:[8ڍQXz"sͯӂ6d!5ɩ BIȌ+S,SClQ 48[d`zl!%UtH"wh Ϊ; NkV9 p3 ^;z(_?X嘱_*UG!o> Ὑ怀UA{b.,m p-~9 nqFI-~'r/ ۳o%>MInj[m_Y HEždG/Y:FJh`0LKtNB9u=KIL/7d_LoaȠq:=H<7vܷƎk|ϯd w2Édǿq?#(q3(_'Y)-VZT 긎JiwRHimAwAoCl>Ren bzvEIQPe"4"J1<-P~,)#^O s^+jyW ws8mQi_Lo\v4e-2AS26f.02Tܙx}UOn|FZjd{z`P/)di5pʤ+@^J,ޝ(ӞQT'DJ xe֜rJ|+0`w-"zL( 4hR1b|M:Fy=B !Q7A| 4PaM=x)f'3BݜJ>s#(i)@#ǣ6 R2;?Aaj~+ZVoVޟ$'}^TalVĬsJGs-TEIAA4!V#@/Q$QcPm03Ak? 
cLFBpQ4Dr"1V- &UiPE/lQL5^QcpȦmvIcAʂ_M=Ȭ,d|,hI(V9#%I鑉D" 0BY<t*.$P*!A؛nG^*}{LKd\R2(rJ/ %V3'pp&,V1Z $pWdBGGiUHrE?DE?qj%ӳOȦ:oJ=t< 2jjͭTڢ$kLW09t?4CQeZ@bI-Յ@|wg奷WORّ+/lx0޾PJ*P&ByVd!"`TDT皰uA-N1[b!.nT՞۬Sڷm2XMHHi[|@ƈa#y-BŊj諴EOpf8X|khnƾu!RV]Ӎ,zD JVB*60A5p1ɜ|3-,OIPh]yPB'0o"ƾJGUq=VA^ќ|l+zB ߁1ȥ_ Ol*e]DSjke 77]2Cj\1Mpi BwsdL@TdY pO=u9J98-%\y_/DON}g۟EM{^_^_^_^_#0l,*e%^hXiT"!8*QW-OGWBNGW+%FkIY-Hj!K0VA1J\qU~' 40D'tGdpP0!jw*6B j!mPzWWZaެsb65ͦESgf(%^M'[51jLIhLiFJU>թI|0VxR=4k[Lrҙ[kFn#a4ΪV/ xN 1HJVҌDR"eͺlJ;>z--{]aZvOۇlnʥ+``!!50VwD6]T`9FTvAVQ!jRq5dXUӮRǪWD[Chm}4pcoFKiT$I؈ WE1Tcr)3LS-vs'2J\t>`Oq!s&[2Iyin%;S?wx̫1&@s~šԝ6i??O;dE.}RoBxv=76S%?RtsƓ<Ow~mkc[0ސFQcs :^I&uaf2.K-u.Q*NNF {U?@o8ۻ,z*W ӉXY5#BV$b%T(-AbM Rƙ\].KFR4IVeOvRaL+{'s~5awÞUCCj'֑BH3)<D[^=-a,j>CslvFnӢͫALMTic1Ӓ뉜Fb 1#3`M!M?r1Tmc>>伕Ւ>[dI;v1u)z? ʃmU:,b)EJ}Ai^f%3'Laضm(*:R*U &c@؝6Xs̹RF*3o=~f)~F(Ht Y+-6Cl=|I6~_! 8F'o~T,Wg3:7K5Ԓ'wMs T}  Zw< (w7ǼgVѩRiBݸTpFvr|O4d.Gzθ t{eyD|}Mc XqJ+}l|sI }:"+a.#)<OS)Wܨ#PF[]2O6vv# ۇ!v5Į>=^~1zG{+|9Eŋg 22$`ǾwةD)1O3VZf/cNTn a^'dI{C/if}ycÃ.@"@ݿ~3XhA7%=s^~9-yûgLJ39|, B";Ww zVlT\|l\h~af#{N"(S5l?Ive$KIHضm5IU)zejpfd% dA'OaNJmKݧ)IYz,_*Uy2 |Hyۼ v%ض#k}j)=>Y'Aho7J&܋imDxW#:WŹ{A)VbP3ԔK#K7!P#هBI!Sqg~)KH8U(A*D'Ֆ㻴?}yi$OnvTyxi|Oyg,% { Ĵ꭭j6O&3fw aSԂ$E,f<Su)]t%lb?M9y,(г֭VҔ,6Q4@\g0}Vɴ}Xۼf!D-Hݢ!O{H59E MҤz!8^~cMUӝ/U8,? fmh.I[줥Z%YWhV4ޣ向Q\lT֢Sdu}2yK tRxpgi+mqƂ[Ll@;b4v(e^}K^ ItV2)bde*jRT˽{b=[fGZ#4Åϛ 05x}ΦlW7'g#i ƅ]"%b&<Dՠc,ŗ4vI& $N;.#ޙ=m5{-hDz;~Q~qpu:?Z+ߋc%1YOU c @ZBKPsjeŖg2 IL:uVl=b,X=OJ޽*ubUb T趿y"! 
s:$A)C5Q|MEҗ~4ER\YKOg,z?ĴYTw2遲  @;S5@o b9g3V 47fg%D R)'nov1Z|>RTZ~6 rNn̰t(^Q1xoZzY::%u#*>NuZ˰ևV#7 ,J#(LE}eز *[m62/ xrC)}2>؞49"W,b;rM#pt{FE)L.A(ɣǃqvZ 5h{%ٽ/4J߾`(cԐUfuE^<du3wnX75y!{~/fV^9_l&rNV[1YzYP**E"ZY]daBTҮT(V*3,$ j)ոv3J(ISy\I?N~SFJxpUdרt"% R-Z#NrmMJSU[*";[,-A̲u,12.)IK\{6 /)l21e?lLk˹<)]D&cS h,ȫt;9bV.WTB.h&%D ̭Q1DJtuqC D1:]k~y87\o92a}F#'$8M2V[0iKB.|K*Hf-ٛ(5δ9ak/Vk9_xv<5e&CۖO>8{0Hk<Gr$Pl cO!&if({1ƣxfUxw2x 5`o 2H!g]Qk8)xf_5VTfWtn ױ"4~x6t qXe۲+IgbW/*wW輝wDmnv|`1E\( sgWWs5O߭_2fm?}jf&cx>to_wڴHzQ yy?xp4 @|ۗ\,ߢ!ϛR(MӘ~+Kq8!j0JM>K%GjeB`^#HaNՄ5eM}/b_ !Oka 0op{|@>;s:DAm^4Axkl9>7sdRI@ޞ_+i豗av&#\HЎCd>C)xxqv 7a옹hRa75sG/" VhwTM']^oU/l%"bnDյW/N7?juťfGjE=tAZDُ(z209c/2rօ⡽o9H;:ap0q* k -}HBLj2JG,'擎&iH) -ĚpVKQ%uX`y `6 -s 8`ĐkH\f@2[thK Xf?L HqloxC+\9 05vM XTY7L)ar2ɱyyb38Q3_1:a[T!"9Q#b2JkcȩN85# G*2GCByteHMeK@d9ji&F)@GsM 6^r9c]BSPc/~!ewh=ȓUt %@1m1Jk3Lmh՘IM(A$$5ȰHqq?6NI'HEL*F9C׀f8's`rLc l$_ r(Nb`{N͛XD >ETE|-;b3[;G4'sqٖakQɽH&9{ސ\0AWe.An τHzٴqbTlΑ1HI*1:$cjfƒlN%0FiooUJ_i˶+b{X&^`146]4.kuڣw8R0Q@mG 6[d0ZL)jPk҂iq~̓'X^UB`h;.6fj 7Ęm}'!-wFJ~1[*~`Oѿ1fE b ;F_KC J:l0>0ra)z5ox8a#5eW)ȷ G&Ӡҭ1'!2*1o1 nڹ׷&7BbP9*g׭5Rޜ1 B!' ^=?VHLrj_ڻ׻O0'zS0a 3ca&`fs2p ! gc\L>?X6d z~} UfvH2 Q@wݍѸZ0һf]e9=~ewG qL gNV#rTv-FNBc2h%KaMBy?ޝ;~~.`hlHs-숚8Q֣`%k)\82UESɛ864:ۈ[hx|d%&XO"ȏÏDpɛkϚsc$SW_?{~Fk`W'[`$ZgCFZajw)() =UD%3WÐ:$C1+Jj$.Xsy9|Sr4kzkN}%{ΰ!sj 锵79{rcQxQvqtqE5F7ޜۥ!ptW\si~pYܙF%V)Ĭ A5ꢀ5ĘNԮtrDЭkK iq4u# 'x^{2dfo5Vi^jb}:Imf*^=MstEKlh޼'83TQa:%COE 2ɜ/z sR (H# hk d*S)oوɟ :ݮٍ]&R1٠^۳ #gY,FiVޓy] ofb4PÔF$pry̓9La?glyrT+I7@yXLw׶= K[0m~_3 zNtɃq!K8)f[3K9ߒfH8=}dHkb'؏!Y7fد,5~Ze*簾 5p%ZbMċK% pTrIo9MC}Q`>.™ .> k$ԡ`ۿ 2Sif rU>7;1haL zn>$25ջj1IXONtcvݓ˟\#%[I!y9Ft_⇋@̾~4jCr~~*PҨUc܎0z 3SSrT0[NFiФzRIBC([c-ԒBkdJSEZ?6Тc:e>rlF{_ǒ*`!<0! 
S!Puy̓)#  'Mo .OAdOTx[E)aju>KrtzΎFqTffyC,/gz(at8A7G-"K4(fe]aQkw?e˨L?RT7djAkԚ?uΟZ$Leh[zZIvS8ab{=upwnzdmz.Fh`]g-+,L褷јJ-UZ8k5<=nױ b5ܔ{ERsOGd,(T=p?A38#T<8 2/a=_F-=`hy Fg^cCmWI4rrů]5v,wneJ8fjT\iIz\$E_[lEF -xI7O]?}\ǁOyS׽abw}z>9T~0S~wLHQhӗ-e.=~\1.s%ξj3ҟw,LH~i&c$)]Uw=2݀J@&͘%k1V!sL mPH4C!x ?i'oߧvs,*f7>$SU_Mz1b5j~ûݚe5 RTšCn5jP3CHX*b6u"j,hvPpK]/!u&ְ?nnW.a _or46N.ij>}PᆤkLҔJ|QcH5W4ie_*TllJqBrv~,.H  #*l*[x!m>zllYu7JL5Q:PٻjC `:eZ黗۳y+Q)+׻J滁+6oP"k﬍&{}v{͋ZC HkvJvܳ6 y{aD_OH^:l6_X-ڡގ3qM28 Sł\49fʌv76g9 ,rP/ӓqՊ#}K[?Z|ޒt]KsSZ6hTEETc12ZDȘ:3;^wV #x@J/mr@v@t@Z$lD;->{<]5WX.U4gz|-ӫnwcKo*DHѧS!$W?xs&}<1LD ʳ>kH;^?>]Z:d3]+GMiI;k~>wswXa1]=7J.|t) 4ٍnx|n}5Z,+Ѻsѣva6]vssI@J~t_>8oQ(R[1|?|85Hp0wƞ N eQMvlŤƔ+Jy%߬cf3fN>0DrvXxwEFCd9B+g˰5 2)Wt+Bky%9Xm%/d![7Q DE SZNEXz<6muGޢo \{Hjv05:Lz֟ɞXC,)?i'ɎKv|?iNPW\#RK)TayvHvp؜ K |1x&C>!'mqӣkϤR䭺r*STf ~QYqz3zzrpB]cf'mTznANɯ{')X 𳓓Pޤ!ymO𘑛|vD>iS=9h"kl1}|2R6Mj~Ecۅ?61+@]ט\Kh7@iS:FDSi,EݦG/?vĎc\{vͺBD  nˠp\!-2>;DNhrnndkۈw~/og3~I4`HPkt*ْUIQjo}-I ;/hklkzrux񎝀/n謾&BĊ&{ bӛ~oG˿|?l]BEStPCS1c r&g"()Y%B H\ **qq M:guQv׻H2ڀ&3oyڪ`uǶkg84 ј ănk1de&I/^T ulf{=e>]ꔓ%ڒj|+4H# Vu\6FgI$k :ZRLB.eNSVb -(jF|O/nC/\!-SGJ‡\HCӣ!jglTXBGMǐ5JPW$j@`* 5 zJdb* LdT0fQ]t7XeX/}MI̺RPn Cjzj.QQx+yL8="Hb7\?2{HE rl,<.Pb)rx ,iwl"<3+/I-jw,˦iV:l5OLm&fֳBMʶ~mU؞;M/ P{_2^'9\)$,:4h79Sm&O>fznOXa;뙄pt797m5MELx[мdFS ?7~H:;'|L\a[M0rli-:`O-{=ypgR),Ѯv 7 k,\>/ J6^EW`bM4Du)*-lakۨ<čf {)L)F.衴4ZSTrWo2}ޡVe˿lP$>Ͼ \ӣ|sΎbĈǏ@`6!–!KXlYaU>3PjNrEeSJ 4YWl~!Szo ÷5n1AӀwެ;c3ԕkжpm5\"3?>;2>Ÿ-Pt\ư5?gtnU(fVs"#X"2z&Nט 9=f>.YkL*G nVٹq" Ibte^qӘOnu:Y7س ֒HGOCi!#_ђVٟtטdCYX˒dEȀE,L->.>{R2EZcw ٛdRN1ؚ[ M2~Pi+1IM%%j=3TUT^7X; WiaT4EjhGGP-P|9IzpK4&n=™2ITM5^Qk!يB,*HSv̈^ $ųfu}nZ?NE⋦y2I;&yiƨG41=:K "t87^Y`)z\c1PIa(2Yc@Z% e6R<8EAH2aZX~g N-LF9#~N3YQ%S`C`M,Ytꐱ%Ur.VoȈQF&l{& _Tzmo;:صz,6ꬄ0"]0 ""l]$Z#!Vտ+O͠)yateKixUXU^'EZ4 XϊCjRCtNle0Y+ZwMgZHB]v~Tb$$a{7% NpU_#r( 9DxcYltWuWׯȨj mZnPCq9VPg?!-KqNH,ВfYp WEi-GeTcÎHK%ҌDHY%{$2P"ttS(68z!g?B}[ @Zj+BPPhae 6GuxiԜhi=2ĢMRTJ'+#L:bJ\HY)nҋnRہ~6>m&n6)|^Ey7}*H5u4Fӷ}vkcs~JFU_8z"j~/^G{M?EGIK;qRyw'4p74MF%EZ-Hvߵ䔐 Dɵ ! 
쭝DRd^Gox='+ RS^/ R%ui^a"CWrJ|Ӻ5gӕHKtoSޠ0 Ό T*:^<%N`x>D{w9*nȁ^@(b@D( L{G#b[9z "p5AK*K#NA M+\u!zWFf"n>Yd`ˈO)-MnppT6,LJegŠZSEM jL^RIR JC8܅HEA^HY*J͘OQ!`p\&|cKh'ԻoFNKsF~Ny5w \.iCrxu9%)6wEyw2nx%QQː5ۇw3ѻbh[1!i%a"K|)< X7MP\aowۆ(G*C=v7M<'vxjz3,xޣ*.x O{y#0IZK=aTHlӳj'"bpRKR愫p|L?|ۆR5~r F6#RCQϔX ;G75٬<=x[=-nvy׉^,Q?,y=2(aD릣*S.`?3x]vl?ew]E!Jƛb _J:EDZVM\"0f]:gP*' x$4QR[&Fh:wt @"^!MzD-r;ӹ )s;&1Dyt\v)SYSS"H/krJ/sl$uӇ7օ?I4xg3I86F7w7?Q#F1[TTT_?Tg?$z0yP+]\Ǣt"HI򴽯7^}ƛyrTj5ͫ\uw,L)SMY w#X`5^V4և0DUt\9= :\1vJpeo ˴{k?b]L#߇r!>%bw j>h5Vj1khbdNo :FVO%eQ#sƳn4C={0|%H1rH'6h>QSNtss} 9AhO*0 hh0<'F͌ɑ/=2@Jc=MLøDҡ@Nb/ETnNR?h㘷./YvJqd#’u;$5)G\2 :G<䈇f4lQWB[͔eAm55&HDMp _(L|(7 (5{V/Eq;` (#Qaddb:}d@WǣSɓ$6 /~-"'_=I .C[#åoO;|DoJo~_d6]5fk`o2p#5X*&ou}0ƗJ9JK\򱌟kٜv2U6=)3huRT\T!/uZ ߴTu/ZwFe!ppR55;$+/jJ[ay%:#P&ue%"MJ3aW)y {D Tx9“5-fphpWzJT=~ox}wPۛRvbQfUsNpwbQ,&Crx1,]F;TeZq EFvZ/ҬqWTњhlD#9 (d;܍óf5;d#ЃКr#`)%,SjzU,=I{!|tЃo\-U޸ƌՙy֋ry"$8E6WH+SVPΞ#z_EƴkEGs̿dƷ·Ik*#bb efbyJs']@B + M@VXmXJiEcsPqRz~|1WlRK3%1o.MܝM6GIYϏ#{o7ZntY1[ouiwIp5?Jo[@}G~v8<=kzlU\~ hӜMpYP!M۷r2nߺƭP5&C% q.) x]%XǪ\/^_Thv4q\.~.e%hRe;Ϟyd;1 %\A̪'lU|a5e,*,?w(Mq$ad vGw֍[/7tKJ%*IqdwֿO_ uBz<ZOEywhyV~P?"%;63ЄY3*J퉚<^\\$R;-f䖖. 81XCO!FXpKXXnEuqp닪"aŏmncTV%.|o"G>>\f|H9Fvn/f}ϴzOڑxcCA/nKNDp&0Rҥ%sӧt_)jnFuFĢǒ[ɢd9^eԞJآsWЊE+"u.MrN=TnbeT ME%I28^sP>- #xd ք2R[R(x"0iH/PYT/Mʃ(Q[L,J.]!9vDr􃛪y 5\ b edDBAJ[Uhe\ K E"z)BH[rЈƖS %(uxS<#nD\*8)cT 7H"+(4EEV](Rfvd"""ې+DV̍ׄԵ0zhBwz$[Bp?1˲tQcvt/Pϑ-(;+a Z2yN;p;;-<k;[/:6䊱/.`o6oďZ{#[f GԀ1 ltR`s(NY-Zfse:/R$MUP,yrVۤM&Tuj>PrЍh-˲{; -܁H{f{%&!m/;NS0rP%j*/wA N'x3Pע(UAq#4UF6oxoލ,x[1BlQ9 ߝX/]K2"%.8R j8_&QsPbSZsDRFRAm^̚x(ͤ]0"v˷a'%F~Nndd|ߜ܇ ?@폓qa Lf>ڟ.>9Tu! 
@)IPPk}啩b`pw^$ْ'!` [#T4.M9nz׷7i=s>5BB BX]r[YwYO)Mw{{|yQO^}Pq'Oekŷ fTskRU'&We)KtTШZm>vO+axSUWOZvW d9JdDMmV|Ӹd{l`⃻Ҝ2**9FqCj< 5{zC|?#[6f>sJFr7C0#!#"S Ԉ}/fO h2}nI0+TO2oX_ޣlՖCԂO_՟ىKl?]^OTղ=U((}50]?ݺEvڭQ\(QB)D*RG6y t"[/ъ9kE ) mK^hŨx:]b9s*RԜ*yMZQ\XbBanWٻ6lWZ}A L01Z-)!)'&fuIʤaX2KwRgtRhiDAJQ>'; F^?*KSO+5BԘcL7`ך<'JSWt<S )ك<Ȩ1x<3@^6TX<)nX<'̟8ǚ-阏dͺFͫ_?_zExtM&A|n146?%ur/m"׭~s^$LTJQv`*-s:qAh*072*MZ˷Ñ5||gIFwh9EmE57E&Rsl _wr w C/Y-uY6e?5dC}0Ys5{%e55W4r)55+})0{G>?ƆdLj^nA%S!W*&Ѭ9_RqWsI t6*9?p)E~GH ;V`% 3yb&cAD*#i$ڪ=6 (ϑt+S$# -[h\3jd,=$iϭXmzzӔg*Dl ڨCw!Y1QYGbu~uh5Fv+2 @W< I/P ﯂{ѪiwSUP)ݓkus!eW1gM3JղNFXg:Xb3$jDrgwgqV;&`ہ~g#Z H&&El61yaսEbdU w#I>LPRy~iRU"Hu=糔d:S懇uYU|LgO46 RRP n=T~/}g>¸LN|B֘mX4Z{&Fk߮lQzK2on$3"LJBy$lQ$ DSFGic(10>ضTߌ) 4 "@Fb3+0 @}f3=J>L-dJ4чH) HrƉDeBҴ)mJd>D0e15Zj[ytR[xz.bw.R&fW̳4j7js(fv?\;{ctl6zՃ5ȷac5**NV@ )&sj$.3~GdȒi`J7m9 >XP-3 dLdF*δVJ ±}u]XC~ SxuxuW)@ͪnR0G蛕Օ+X(G\qЧfd"<#kc!9mܧWR;j[χ syI.ӣ^R SDa]M'Tx%3~c{\-%S,d4I k69/"qx{`LI"-nxT u|1 t8o,&nM:Ig7峉mn2eM .ÏïTā:po0$,}σ!܅Cѥ*`ÁDx1IQV1_V ԒaοR4B\*$#/Wn]QZ)[HkzQv&IeGoRћrѥ5MĢƈx%A.pS!Ca|@XS83_l^$BTJηb(WAI1RIj򷁭?bKO/͕oW30UBZ >6dD:d8L5KeNR*}94KmD[J, 1M)]f ֲ#OkFH>\ mVTw{g1@RNjx-(|U^l>~E|uf+ikM >s%l(r".&χÕpX7;֍>U҂NDTʼnuL19)2bFLLŨ&\̜&JDBWEDq)H2hYTD `LUuJ0g|0֌wnZcf I ʨV݆{6JmS9߀Ek?I%FJ1F/!t|;sEWi`=<ăa0M|~Y6B_ [~|ZUޅੴq7 MIEӬ0O?yG 49)>xdOq!MV)VB4݂ H O1PA!vXZ2Fp@ϱQI 1}5@㟪;Un]|=XGlF2"1\ Ycu4UHKo,!IZ P>o#Ō+E /AF˻PZXRk#~ t d_$xDsh(dt#c* 3hKQJ)T vcc4,#|gK5wh.'66f2Wj,m@ÚB!a56N3FHq`&Аɚg0V&A@`Ã\I%YD8KO:E!Kg@J(4ћ:090Ex olUo)i^9%Fz0w@6 rSgbƶ7K *U{:Bˆ7 aNI4#y+*wS{VYe)Ļkj9ˠ\gER%%R(]+عo.]$&(!j!5kWck4Z'7Jc<,:9~.4e/)kF;e?uϣ7,)TxK;Z*W5.NtދU T)ofe1V^3Py%h*"5ζ3 T/K T@5Vr'zjTǞ}AǝV4Վz^ Q@5 q J " Gm{]K*t19 Up9)6y.PDc;qC=j a( :JwYzpwcoF3 |ߤ7eu5dXd䓞EuԘFL,ˇxV6Jv5PZI ZzX;U+tÊ wD{;Z%@ pY*Ps|К@sݰBfjr`4u =!%guǩNCȼ )9L3K2a3ҖHU\ՋԞ> ٫D/83~߫{c+h;^~Y& ~4]䯾I䯾)]00jn΅8Lh[bBQttAwQc䒓`BNM_ԨSӗ-auR&YU2V^2PH!EX : |Q9#R4j|]ݑ&۳HkA@ ST\^ݳ.EB^V{`aǔ[*3a6, @K͖Fem5G؀XnzagI~*(P_x䳾Y*tLy8Ӳ}z$gQgT') Ew~2W@z%6ZGGO33YQrYOͶ(L4c@fDKAJ!.6ŬX_N?yPBʹ>cJjJT+txBA- bXrgC{Vy 
댔s)<Lf|O.h(/%Tt&P)IƨgEb1[Cq+i}U__k8LwJdbpJuzW^D-xل<%0VxN/Oa;U A%t4!;;?1n$9!33H FB/ cZ:XxDZQWu4+)WJs} )ΐIBxO}g֮7M6$K]ȼd)H8Rz֍UyZ5Heq6v6ђYlt$5ZQ~,ywތ/#UZȻRU WGշ/~,YN_{%Ȍa{5o[ܝȮZ7`JHʶS`z?Cmt($hyKp1p y ts`Npc3 oΈd/ BDYc"$k n57SO*[5v3w;rE@Xe)7O#)*öqE#-8z˛[,LO$ F[ M4_IиuLh$ш9cEZ };SEt=wR[UMJJRRP?C[2ײGAw7G#z^I4M#)szR~X^gDID '3FFd3soo2;EzԺ x׎Y/X<>bU4^=oVǫ58ku@r$>5︿BtAk"QY8g|n<ƺWqVͱ(7Yi{ -%cV˥.$dH"ɲ;M\Te~8 >쯓V຺βޜ7'!IzsRzS"VP-XJS$\YbGh"SfYdQ٦/N0C akwF=l$Lɟ0sRLx\rF hdFB!-3F"p֓,ڬY5}ƚ aTsQm2wOݓFs@Vs5ӌSJ<-y_URUoꇮ@Zِd+&aj }IgpDi5ķZIF3˵*' ,ܰfjAngVe5$MlC;lJ`(0ljmb1WpCDFRV11R2!T# fHhzI BmEHG'5*ce56;l e#"x3CSSH!r\# "blpK 20zD/ uFM0#jb{*Ҡ ˪5(|1aS/ Sm3\#IL `eBê6A֘t3!Ոn nq4qBxaIJQZKxԼĎ![!xQa@ֆW[2[ )< a\D1 I@IB#>9~TgY 3/GꪌpL_W:^NX?s t[n<<3ޜg7} '~gf:}yr7: MJ{T&(=xqnbzo vU4SrɯA6j 7"îahSzPϟ櫎:6po?w7m3*8tI?>?ę y2Mw''tvxoQBݓqy~OCS??a 8|13``E~;7gШYo%W/zYKg׋ ŏÒbl3|^axI_ ߺ(VwR8|{nKoÑ9-nwϾ}upc?7Y\Y)s幾Wdv]j2>_ԅ9Mix~J  wl?gIWYz;?> 7:ʧx. |au/7}ұuZ;UOOߟ`̟C{>}wԧe_ οzeO`g^ +pf8t.1C(\%d(ur3K*+飢8 >S^:Ay^)BWFOgqڏ÷T_A?0N 8I`k$Vw×ՠpS-./l=Ԧ U{Yq\ wN-JriCϓKK'k(xB s& smv[%@d0A5, EKη9]$Figɹ N!ǃcX~'xpckguceU6:R6?]=1j _eYOB3%?~ܘB{,axz@[bx}-Җ׷zA[^y'Y5`N(Lpt#O>AJ( ?^_+ .y,d4 ?..pIVKҖK\qIwT\-Kq*\qb G$h/rF9㏌JR-R[*yӡ,d4 R^mP *Qڠe-d_Jf-ld$k61`PIIg rI(1DHRQIα-.6B_F U9JnHJ}lYd"9tڦe-lYm+D^C!c|dR*UK!k-:Qn,]!ź]!ٲȖEn"M[ٲȖEZ!9Ri"89Tqˀ%ac;<."ɱB]!BPїmY6 ק`!g"ϟ{q){'M̟Y3Mʧ~X-ƲJП:Udls`LO"22tQ+(+󏉅gbl_EgpNB:$9Ù>{$S$q2L(u.U^ "Ά5͌WFat12,hja5ݒӖө_n"Uȱ0yrzZLMwv&@/-L6~yU;xEo 0єktWu)gl"p~߃^)b'`Eon+-z΍Ϋa ! 
h%rJ$ H]TZMO<WErFJUX5|yG#[X+Iv$1a'LE(Oִǚ s6իuXKItd&8{ pNohIvec OWĽ:.wr:ulD{󹢨vw.(_^U['Bc?zX*/_^{Ե~y{,2md/\u- #۸d+?$u+'aQY>B+Axc`G:O[QꋔjP:)]T[{ (^r{:\%~KDѳ#J0݀Ć+9{>b=!3JሢE\P?nWM>Jx,fJ$gU袡Xi+xu7U;t%0eDOwNpVCG؊qkc-{_Q+^}~te69Uh-PWUJbS1Ud4fjYJ`2*+zY [OqG|֯QAxWξVL/oGӷ3H7b8ƹC\r&ϐs>y#-O K?֥ͷ..|A|\YތsCL'ǔ&a#Um#4cH(^6]N`dwnw4h -iLZD~‡e矺gs;x|Ӿm xnJ(빗<)|]bύnu6#pgc qa{?:?A6k43bN$i߉4$=ʆs|PsL@xnMixK.V8%)ReR]k)4C};g F=\S3OWާMz s>Yk0tfozs~L/{WWw^!Բ8fkPc:Oi"( Kappc|๴j,F4ブCRƌeFSPJH~b  0["fTln%\V 2I"BZkŭۓHD"g&W !CК6*(Mh{1$z{"Lc7{Ղ R*$W"䥷+\'%@˜C\#3ԅֹ}@xO;o'^Ρ{r5ښm:viw8suƢεb\ky"aϵ(\Pc|օ Qa jUk$js4iD5e]X:_Y]83 0Tb!br'2$Ά/@B-;zez[mLmEb:tH:ь$0R#!Ƿ>tOIO *:ǫdT^ZgP񶽇} LRr]npꦆK'~|$-dJGuy%A@~q9U;QXi+V=CgoƧ7_G @T~o[o焭uxx=Ûއhَuf}] FOϬ@cofxOaNi,xi9O/*~W@ZgG+ޅz鱽7GZqjL(с<=3y+ zv42\re3h2ܻN'}S i`s':"$U*Xh?4s'I_Si<5* E(8rTU'"k0s뉡֧E#9Q! 8g%aQ HMR!g A!YRZ9.mԋxރA$A$yUAO 0H9??$tͨ7qMY:j8O `0MU W|RdRV__r?Th\$͢ ,' lK</ E"_涹x> <8[^?çE!Acgst2{C׷ <}_'_>`<.w`t?wgf2`4\fׯw<ʶ *fX>}wiOĿtʗ ))%.cךAR=b{tQaKtwXJQ&/,w8u18A) 2i 6 (TO*΁˝S%X@a.mo'Iyl 8@'!bR2rf5Ix¥. sdEXz 1fz) 21Rc)R   AC Hi )Js?x$f Ry5:/fjQ$@=_}(3!ަj o;a0WyL87|Ao.%33vhhLݗ],}B5|dZn=As=>Hs/׈i|$4H#t֥CYCZPa{LL\SтE*$XW`TC왼n[جo-"q,9.Ͼfy<+S U+E;_oH2r(_EF|OfȜCorlq]$a7_igu{8Blsk rF !BP͍qA5ZV+0hVdi`.@E%^ fEIR<\Th 4sIZ.xi,\X`(X49ÑDY}|0Φy?ZxAc_ʷFb7Jmh16  FT8.E&wݨ>HE,N47B_zCux #&wGJY\?F;~7CVJ3fLIυqe' -1 =Q%ڌ~LRQ3FXѕBw[YQ, $<3Ш8|.z9um\Tsڨc ADL1}!c \rFVcRI)!8<2OIPad<%gH\500+|`É.|bVٴ TgS,Unqp(9XNbm;){=lNGE.$' Ȕ64FlSښJVbB:kPcBVq Iqtsd/;<0OuT mZkcoKfiG|qڢMZ o&)[sYM :6LesJe RYw^zaCL*KKZ0I)TLي΍+`U`2fl@CnAoVJb-Rt 9C6 \XLk{bTN1yI^ F+U Xf KAcӵB-y+8C ?{3/j 85*EJ Qƈ&^Oس|ǼKwe&jTGCm'qm)^Kt$502읣['tM>d++g66۸z2/v[){s]<:!BICWON`s32'0)vƎ̍G{' =OJ @WfO8H{{ylP:KD]KH{ʶiHjBY+}@Pn_v c[Be|B[$e3mʓdf=3<`M> {DauQJ♪,2TFu*(JJ̕s. 
8|>_{+gbwxYbki `6&;c o`F 6;:ϴOT΅^CA^š@[Z& ;D/ ;A0v8`h,u%GRinm0$U%hS+\UhfP\y4\ZbEGzk=@2326-I`xk<:Tvu&w^ p"Ui)8FH.{kKϲj6!qDmvϰU >rLȽc"6b:PhM`ʭMfIDTA[C﨡ZJ'Y[-ẳk~o-)11#N1#>Y F=F&Ur8wHG4P0BLC0}ux}HEۇ817wZaTGۨJעbƈc kT]??< $-1pYA6(EaI{FBJE T2 xdZP#U4i}}Ml0Mllq`PD_jG4:7 1Fף9Y=;LC#B!-mᇼ*Lɡ=<];\ÍSzބ 1]2|L[IBz궼]$ g?p~|/?3;_~w[-h}5--o"BA["zk}X%lEExT Dz<'>՞ FeN2ʁuzю-Iݢ}?7=ˢ$TƮ F-s{64d-iaվ#fZa]w)V+[Rտ|?/}쭹\nlE?C[,)[c`Sb{N8 *A>#w(A'xbuǕY8t\kDIEEqv7 vma/o ZfK铰/oN`QSbC6e.L\ LhN`=r~'t"(qUFQ=Q.n*#׸ 3)K0F^8a"D-cyiHw(gb4BWbPd5{ O<닸8}_*;Y|ѳJ*rQr~yFȑ6cn7 #euc&LpukAiխGN䴖խ}-9!fL]pNi0fzLӞq&(%ސSx7=@Qq>>/Kw0'n FInۈ!~ޘۓ6ͺZkN6^Z1DJafNQR ){O.td-LEjוL"R pnq$=|&mB%\.~yFKݠO mC=OB40VKړxAj'qy?MRef NWr/6'`>Yvt4(=4AZI-+V1u|nt~_\ s1 Qy(-Sd%X(nrE(d jP1j|n v j7I@h\/WԅF O)-%`+;KCF)[1nW>"{&VK|ɀ CCg< #痟uqː):tGNV;C=|T"J&HFHωV!}et{muCN6=${my ͖Ro!!] P9e^|;)&GLB^VFU8rP91}К1v5cN*EFה8@F0SD{(bv0Cf*e߁唸0tx_2S(7ȻiD_5N!o޵NdI&@qs`Ћ24t0G,\q$MKcұ݈]-;'':$O:.:ϡk @h|hV+7*TCb/< K4GYn:MApe,zwJCRG)2"tBI ^EI-kkT!\j]o 4/3N8UhӗV35V+ґu}io.7B{ 4@Lzl·/>Vmh8ebtZM|k.q|UIu 3%g51lZ[f1۸e_\k'bS?1Vk7 ?7>N7[1QzJsRO`xir"k$G'ܫ`BSQ eLMxJ9%*8@P1 HZUwl2qza~;n97 >-Wx.??k-88XmάJL2WשO`` [n,1b Ͱ}?a\#(& 9G@ ,)aQIV !Gd'!(D`eN8ul4v#r'p"HRi]j|VT0%Cw5ݣjWINl2Њ(>PMzc_{ qøS\W*kàOւ ϶'!s!(N95%##X EFMxjbi.8傸s&͙+}$c)BuP/%̫>yBP5 \HLY .yq&ibxjjQYfrbj`=5Nd4Jͤ}o*'GeDk=@~jV:JVlTöiK6X P~ U|#c6ӻTZcT;ftzq=gu?L\~n-a#rM,:Ȣ#lä >v{%/YJ (kډ^ug/_q(-򦫽,ehsA56_krNZٿ/*}T fEmbk02ejYqtЄ8tglR3GgCi.Sob!LsE]",i{}̴6"lM4&QM|2auM=b; gS^l; ]^ >EoSr-vWƙK}߁2s95ڑowz`%l'oNbYvkU/=#x[ m;Oΰ.䳺&.Lh *S[3 ЯήhUq޷[0nW-og n$ަ Qhq<5Ħ6E±E?ؕnXRRd#z%C-+Ka${w㎇\ofN;Kl4Ͱ|c& >Oː1u(e['l.116s\L(9}9Wv6vs-M=B],,(Dw9Yyr# i qܒ " QR4%Djlm9:] XGe뙋Nd|~rR937!Aױ߽N wg`Hٟ};?kZ'7,A5 2CxuImiDiw-gJd71:ӊP @e XH}eyoSɀ;t 4z@T럎az]{7T~mFe)k!s*hp&̃k0Ӆ! 9[&/.|!y3=0! 
5P4Cr!0aH;.^߼}| Oyx{}rur\ݞvɇ>ޞq~=d3^^\ | )xЀƔ~H12hx`G쨽`G쨃5mG tX|vE*E eQۜŅ^ЎZFQSvdQ9DcL͸<|`B~!ۜw}_­P( ֜2{yKK>w)ȹB~6"-0*Uv9n/Jnu@nץi|!ns%4(ץK0h6>P~/WbVp]9"r]Jnf[%J0=6(+xCIBTm# `6ᾔH WkMfd8g $}dS1mT6X_֮OΒm܃*$DؔLZw w>Q+DpLB/ůL]/,Au;?0˜2v Iw f΁kە;Njbj`Bz2hqLep!!_t lpƴ$ lAgS1@0+mk0.ׄ/]H%; 9I[Ⱦ阒7*β,(IE|œ-9{skFK*Ť~*.!F83_ kM* vAsfp-a9ǥ [STb̺jaǰ lLȻyI8ȥGԛaN`[Hx ;WoPH=bfS̳#uq}Z|e7r;\H$c) /o eWr2r-=X%m{sp1mwۥ2+ nA]Vj{6}F)B۶͸g!AH8`n=Q \nզ>)?e 63' U0H.;`łJ؈ a$@d[K$l=+^9b*Z997Ж<9Ŧ5[bJVD?` ILDEc䌫~|r""V1i[ #:<գ4P+ l B d6D8A*O#cG:;Z;abԔR1 ;U L٥o\ڗD7K"Sq9_)=&撨NZ6[*^z`<LB.7kF=?Y 9el-e1uW1wL4 g>u3W1uS1 IEft5d.7ڰxյ`ln-iعc)GܲH"-#4d'm bG^ky +->Fu墻. WkS2D+/Z})׮MRPP86`S8"(iٹO4U|ru6g\ц뙋NTxFʓʙ4$b{A&jUuzzq6I]{͝ygWN/>~WZRյ#mmǏa+c#IgM'l'7M9@e6CS5;$13I8FuMMROO:]>8tk8|#R 88^5!e: ϚhPI&l7j/b (?lF/sNER|F/ hP߿/v.ȿnzۍݸ^qtFW~~_2|E}br{.QD/99:qH HNN-|u']bwpx:f8(L/.w ~#}Yp+Ʃhr=,ʸ׬U>uyCluy"Y͋ `'GW[Q 4Z(}7f)`[AIC5U򳗃^ϴECT釟r>Wå:N|׺=2Ց-;vp~r8U<`#lu*˫ppTFʚihqL׵{zʹ()~5ʇ6?xp؊^컢EMPWQ,[OlɳKvbgͤ3qbj2`z&`~8 ūw9=zմ?=s!F?>=(6Ԙ}şQwo?\_tFa ~ckW5yvRTabs3ޞh#h/iꪋҢt]WTIWTr(.zm: 93C0E98?F(I9܇EZ%3߼iIj w69{:$Qoˢ/ړpϏIp_wSbuJΟ!ƒ찰k`K8;-rbFFC#BmhMSJn:59z{=Vu+v%e*1HNjfYR81RiDf4hMЂA}ls~X:Z)8c[ Z:f!1Xf@U ÒyǑI y.TKJ )c(4{ DՉ#v+lA7n3 !HCc-}1ڌtBǽl&'Ck*E}6 j6YP RQֈ =ZhÒo^kkRDx>}8x`@ eaNefr6iɑ۫<]XIb JyRȡyp v)kD / ǹfKAtO&@|Mfyxc6G\YLV2YYvd)3YUziI`b[8,wEYe)bsƘu;[N2/o'jh3<0w VRH4_B(08KI6Kc ␲YH0lhlHSmō§yFf0<5f!A8 Fw 0= 3Jz7L!@TިQP +0>NrE<+#R'~10$툤cƨI.<828&qꬩ䠱F,`#0{1&E,`cmMcIMpDҫ'L.W$y{S|1dh~w vIU 1@)XB*z< tPcA=uk/ /Q}1 - SQ0R숅]( f>K`@f5|r[3EN(G!DpE5HaZMk4z'C\騔[!9|TT:Et:@y,p~?YU n [8 '`ez?G FJZ90Ңe"BI軛vҳ]ΞHHMizu_$V6o~n4ϟ׳G^(|x5 s+j "2DyCmo 8Ɠi'|o8%iI`B.0h3ɏәo}yK -)>FTHS|_PVj؊RkryXdA]Bg*Ti.n.NrFCXW;gp[yH qaUF qPM% Tp-BnZ~ĭa!q 4gag% |Z`{e/8v, B⧌[@vqXr$ ^ B"%ΜCz!#-s"6 # _*zgX*3̈́' +bDy%0]M(FIpkE/R6DP*4h'@}'>ބ*.%PrCY e)7sCւeS IY=9kFL _~-FF]ƣgi bbi_iV2+MBD%dHhOtQz\@ȟPEch{6T2hZj,bl "F*~l9&[5ZJkk&(& s+9+ U0)ƛk&jֶ>!ްʫDWT78YB'oVn:D|gE/K>&4a0M#Q.M^d:W:K* c;x |18+":Ɣ)"Rg"h9x@U4!hAbպ7wU X\r>Uy@h  'L9K@d)Lt)6& O'1 @)#(iS<̰`Јx'r5ɱC|)z 81>bµ%pWCQ*%YiDe=,y 
saQe,3.8Trb0 R_\=nΗ7Wl.3 Kؖpq4IkKDМ"̝| j aq[lW /^Zx/bRUoC!>ITjC[:N$. Rߎ(̔3%,( L+ŝ>6WJ[TC'TuRC:dЪB8VB8hػV %/!?Tzuh?| |֦*v!D\ɭ'Vا'->}m_jI$lh eKc#}78WPQipWg 4bD]`]~i9| hHtڀa'e;b;p]j f">=ܻkV/Os~y:ףwO'Տ*xh˿LeUszTiﭤwˇhl@ ]%M':'j,uN>ۡ)0.?^h%OwNiHݠQy4 l.5GR j=t!+5կDfAQee{G1idd˩l:β;ѭ21m4Z A|]Lt0&-Ĵ:rR>t%+YJp: V=p(ㅀٗBt/e.&_~.0rn[=>UW0iߌ{/& ̞2 `1l[0]_Ϸ+3C,n@?g3;_>Ov-K\VH!s2{{v`e։! ?EUA$ U4Hy`D[-/ F-ـ[pPu!߹pkzP_$Rv^]O~Y.kBę+T&$n*'`k?;@jt<햠CRʯudžpsaP(ONy(oB? @vo;ht7 9.lcCȭvtiPg=Gu tF4?B@s7}aR5&/TK PI{B qJXwS+ zZ0i/Q밥": :JSoTt`Wk9#*uqV4Bn@G6);_ňRQ vWu׾Q5Co {τ.72FLث?q3(JTwL(Uł\o7W؛+\?-SY>⢜{orbpqiy3Fv3! I0߮|}2c MͻErXyt?u'd6.v~Rrp~~]|qzo>/!plD Ϙ) 23I JtFm~0t@@ |tLC%P@ VI_BC!b85 J{%d0޻I*4C pաej&~ Be0/ :_g kY [:#Ǝ ͸%0Ճ y!%pPZURCU ԑ]aRY>Bs;4O?w3UB,uǮ㭸W.F_wRI{oQ_bj1֊/Ji|gkw&+*&JwS%\/6&&r4\ oMBii] x2 #gX % |)M;cN!q0Z͋VDg̎;v]ϊp}X7@;j:&>]3]-E? Zv7=[;Ϙ!MxӤV-YKCRMiL.!=7* *(#&NMJ-_ָlrh.qLǁo]MYƴ u~nc/'> [3Bq(kYx=*d-K%W.lcgVCY͎V/H1tTXfyg),h+hd/KiܩֈDa!ed/Kn= "*2mJhˡCM1L9 4/H3}V9I#.R7&f)>ua~0ۿUsG>N\+41rn\-F:[iF̦(%-S s$WRHh˲DSir k,„b$I\Gϳޠ0)\(/l_@_tSiNqǻo{/ ^Zl:N\[wc8| w*?͉RjL,V6?eN Æ  .u0iPIJ rkE]3+xx3SbͲ &IJI(a' 2RL^cǰu A43:3139݃PciI&Mn^c #TI"C,2  9A8J8p9rƱuśw"7a:5a J1ZǵSVPZ#v=hr*)(ƅیNvu,|xk9r09wʈH%#fA9MLzN*g9I1Na'R l3ǔ`,,9%΂~1,J?o/bjiCZR~W3./* PEiT5!"X'3f!V6!Rf"pbRK!)`GG&Q#ͳ̒Ti5]G"kA@~q:- X<+{!^p p?[x&0Djϗ¡M`==fw`ֺ̜ssKn^ }o_ ^J"ޯqı!| 4"33a`)N+͕K%ϯҞ7n_k?JOy5 kee\Jt c$6 Xl!s#de ' _Ӝ7Q*WMT\ta^^ ;+99Q_ s}KWk&S, 5 Va볁(7 =a[Kgwч}'{\}PPO`/A"ay Ī5l~xRT|cȦC!:`s>Xr+ўwgwC^}*&So0Vo yT`x}` ib0[cwYbҢh3;]L{WYw.|y]Cmwnp8FȌrfɡp%evNeKA9&[o5{~1Fk\9ewۢr1RH>Xse}b$B%]%Kg:g",d>o zs\,d\!y>{͒=fwRa-ឲO$xXs+1X3c0xX.]N)%%oa2*7t4e7wӲʥ<- q*>ZdՠӘ >]: iI.5bHӥn Lפ?Yi h%qtiHIT(P MҙZEAB?y2m Ftvu k96$ i.Èe&K)LS!DĵԂD`LJ.6)2ɍ+1-V ţ̷Ήk>8:,uE,;=8uvQ9EjbFa JH)DG fՀ0V 2A•$?{ZdrV̖將4SF3L7 ,Z.YȭwKEh"ll*۱0䪵a%UT7f}41HZ*Ktvs9"5xxJ&L !XTN?&RKF#ʸWj> ,P;YP3˜K;NrM %:Q$7z|s=\Z{aM|8V ]6npCIi]svوYx]rp'(0A.>T|)b'cG"RԸ99ǚ#3KV?o/){\`k79rse?{Ƒ `R/}=AY؈3B]-/3HYjR$U&f۰eYWu.UΡεK>:Eh3mG+%u[z{I{c1$ykʦ_Ȧ:ہًCm~:2t Ƀ^#x^)J 
xfR4޽t^nvswp}Մu@Sٽ_"$/50siBhTj^CFtb:"h-PX#KȔz2U^ƛpT=ݟvv[Ghܪyb%V%oDNⲹLKBH9;\ԅAj%0gZcZ9,ʖ#$;P9D*_ 텯k]DݿqF̮?+ iTq~co^ƈxӭzJγ &Jh#FMqZWJ&0!9'+dYG Qл )#[*݄¸mO7 tk鼀WIXDi*vIOSA)Mw?٭R&9[|Cq_?/Ozm,чZb6|^* FK+P''ƅd[g-D튽f%!d %V"Kg-FhFV|W)EpR1" ̹-XP"pLR7Dgm_\C49EΆC?۾w9ej߁2(>lȪm=:Xv9VwD^-,6< K|'s -6g0`:g ؃Ӡ}djW[e? 2;"n>NGNS0!7?tbZR aިTfSٰh`f &Y,"ha9]KD@`ݯٛQɋȐh$2FR0Dawl30ֹdGHНKr?Da}69 e&ZsT- dz΂@m9B}`5"hX_ ofڴ ri1UEˮ%,aJ ǚ=xWfOGcwy1)e{Mz9KB381[7{sm]>"R\oU (diz oCng>{vՏhQMUcs܊>%t%x}Պw%xW,5fcJ>tѾhh9 ,`)B3P<`Huψ,78^iܺml)Csm3ts)\Kp<^E4N9|~Ձz/ut=# |Vl=ſO~zur'~ p_{5ؿ?yz?qGL Tʎ7=*zD3͇I./ms|./az7]$ &0nfw~hF`tJjJBJ._.;F׉A]c+*R˺ili9_-5=npJ*.ר  }!jDD>u#M,u:@~ ,oe)JYؼOB1< , Z/'3sWt~ܰs4Дr:-lB fRY~ɬZLQg(wpM8Jpay DPcE FKAƍ!`,2-ϻyj9dr3ϒRdn:`0ea0_3Pzj0?>~ZT 䓱blO_ZHN~B`R3*"zmu[m+r2!7ZpI_]rR Y>)EX%m֗ʇxS٨N\ӆTŸMU1(Sh%]-IPkRv_FQ0d}g^?9*t flv;=?;+1zޘ?3w óϳ~?9;pIGx|кjQ鬻v*g*!8墋\~t3xx`1t\sVA]7[\G3l7æ{aӽ0H07s]w*~o( XmgrW(;XFm FGB)\șɇO8 <, Ϝf4Q5䁝"Ne|q6"}Ba1s)=5&(!.(R (PEL%T ԁ4 7ÃM#&s[^@U2GP &"E'-0!5=HEJ(RzFQNX3d-JfBCQ$/١%Rz l%)\^s/?& #+#2rP"H͕4h8pLsaps}%BL!R%(h1D82+o\բ6[Zպ4ETVښ5dZBrn+WYAVR e/| _—ȍ(!l-6jʔ5i8Y Ӛ_ԉ#@)uT+k&KeA?>VW[aP/0?: J fRA*  a ;=>>k9%*/BH|ptD#<y|qn.),19s"p q,<Ì)$8Z-ĘcG%Pk1հ쐱s.(V@Aȯ0\'!D6*PY0NKa Ĺ<:j (IazGbFXnr뮙:zT퍽y>0., lXaҒpO] ƞ.Z\SAXUV1|xJb%fu5 U0b +lW0sX!P=@{ H A.- lQ2$Bp R)#C!`K Tw)Fq&sfD!2^Z1" ̹-XPA.`(ĄQhx\+#r{ ,( + Ĕa #Pp]Anpmvj+$F`QD* %( <:04|`=>FJ[r+TQ6 hJT!:`Py-,e=v_.gWj_f kW"+$k %s.) E.{M^z 'IhrQI5f " DW-t_8e2n4igRA$ç࢜"̴ғLgoݰ~i_1x1]NX`DT`nn<{[w 7Ƌ}y &_]wS@HAI!'P)clua8`XXT`c€j/G4kO%Zh(.\/V3arH0H$D 3EM d-X2|9gad@clZ|6*t FH `At`">DZ%ήbb M`Yu88|1 Dr*cgL P:W>Ik#\ZbX:o\8D+-2L怉wُ3cMHN wj*[g_o`bS(rmi9il^VBk5KqA8h؀7qk\8rV\L?<'9E}&ΚDCR :`UrF$+g,&2zWCvdۨẙ(-,S|VF)`ݓwϑj.D&̥ {qHII0 &̓~3;ԓ]`r]]}4О?,-;g?/O!.T-OP\` a/S8 qGq>xy$ `S! 
|^<ͣD8#K'wdF^WQh"NhkmH1_8` v.邠elّI&@$R"ER`2_Uף G|ҏ.tHjtC2s!vZ=F9--E#K"ѭbvRR\9фpLm +c!SFomhKHb*04DzĜ.2<`y{h^'z @J_WC=`cNgh:8p^Tt-%G3gt7e%zV!*1{|Q 3%C-CYm!0%eͷCVm<$ hΪ7^OˉRS-99_ҶǪE1!h~u TVLގFw63U\TN벽0#`q,ơj]):n$ho 3Baa Xd{f!+Vu>QnlZ65 VbWx#AE>"u$n(NiU\2-p8E/hHVcUb'u`EX:Wd4xFQ|"`u+mQ,']rևh@ZH"D J0xm1&j+mV-(U|StCi ؘ+IRYT}!x2[1t:K w"(1%v<ʥ$+Z*Ai˩N:-qL;E1PX\F2 R`ekn$}QS,Q:XZ"f\JìEB;o8ĸ]y ATHYRdB!j7rerWCf.Y!ۺ3 L8Ģa@iIA2nݜCƔǍLj ZHL$IB$ [[9*sr֨tS"VP_~ {mb\RIaEߓFi02q/MG3(C'1 L:Uɵ@UG9lԪ hAoj!]~hf4l1A& 3i هc@ WJ؎f\ɷM  Ύx+E1DnG`甽vcWҙguKN0K*Fb*$O[IEEUJ,qIRH&% ުE5AS>{|.OPSc֚ÇRORR_u,] :. HB!ʝ,~_6 ?{Y ~(cbeX2ʭRӅ Jt>?ě"=.ӫYN{YY{UGAW9nk.E//4v$R 7a= oXzj@7zпۻ4A>廭zC$;C>F?=WtU|5.['^n|B}r89>g_ h3u,(-Z]ș:^$q1OsOw\ ^LNcExV\ג| sF]B g-H]8nӲKֲ=|ZܾiLx  RK(r͒6j|4nšܽ[}׷']Pdžrnjp&3f%>'6@R:%RFJ"ѐL 63}k*)tRkSys699zMdkj-4gde| XNl}/{/%(cVOo\KWT+q~p8.{SȻVNBGC#U>yu[r&]@xT{஧^>}c{DjoT>^hrkU-'_V-_ gt//FnC (_/b}CFy8$ ̍ܐfsxثj4bbb L0t̀sѐ Zà爡!Csx~I8\K3g n4?7v{n0{ٍl>҇ar:{ Ӄ$'jT 4Ss&v7 um[\=H6fD2 3e*PIs}H  =OZ}%GG|Fn8uEkH9[6\{̓{hH Bj>e=Wݎ۽Χ""rZB}NFCRB Uv3o51l]`ܭKpj:YOa@|PvBK4RF >>ab#Lq6 ~L[q![ ѐr1!9b,_n?NbPxoM7禦O٧h?}m9ƌv3uY8NWiُ;HⰥn(:"í|A5Z><c=7 ᭦-h&Jp P8ڢ!4`/0j}߉ #'H5$HMN A ݐ =ጙRSZЫOOəy.1(JSYfok?5.hX+GssgqR~|}/stsڌw0mW fPdosso}W5O!@]na?/SEwB9ʑCAw,a4 k>IpLqL8( ; z78J lN4pY*_9> J9 P&DL*PE:f*Tz;t_v-p DW7)f=ڄEB¾oN_6)cD:JKxo1OYy0-n6̺4ffn͋Gͷ},ӫ QrvuMJQ,d>[U ·C<wp@,kt/tq]ŭˑJB'Hˇt=wT>]Rus3$z?ǹOW/OW]ḷHijkLxv3*ġAd^30Z ~J2jIc(IT aWGlMXcNp1VvkN+ V(mmRLe+{i}1-[nVػK *^ s!sZР ƋşذKrK iJ 5Ӕ`Ne# x dY?O}%ЧRoÎȤvvvv)GRyq#k`G 7+ ]˹F:cR<)0y3 zF'wM#V12bِ1 $\*zS(f^}TDHzl^=6gs[h-#?یjp)Ez6-w֠xA^wUbjM >A[ <_9[q(G O9˄:;V<{9hTk:H߃ψD\qzF|C"#BSY#gk*K|oFJ=ZsHךۿ9.PΎS Ա^Z5H:C"ju~HMz/BW4A>A#>l 8ߚ/j4Ac"PQ Ha4NM93=տ +a[95ݔj=߹kNءEm1DznR3ϰ`88^KފTzܠWAl[פq·K&k:{ĺ,sCe孌Q"1нdóxqFZx>HY"_P?.[y@4g.SNqJppyҽeV33ŰXƣE(X? 
_w|PR_eweGt]0sF@2"Č$8Pe8,G6ɬG#Ls@; GgCoSL?x\ǃ6:<.ѿ|Lc@b~ѣɏ/^=}o>y{td4ΗP-յ4],ӓ!:*9r/?}N>AMѫPl/] /~ WyH)-Ggdftrq?q-tE> .˿VQʾb2:\W(/z$,]o)y~^ټHV|:3g8/=2 n5_>M{wO6q-h-ь[A %mRjN > 됊휚}@QxXR%xqc![p84ӓݍH&MF;z7S>Oo@h1JfR~aˤ7*_Fym\`1"u9E)rpЫ KXK5YRPNQ:-cqPh0jmw9+gކC%ܳKXs3 |5\]p ^qJ&usZo2/{i@ OԌ\=+nlf/8zLJKeypƐ eT* bveMN2a$Nzk#LR9Wiŝ2xDe qP _Fkb*!B[qkFj$rN6i|@1˜8ŒBH`)dDŶg7{r4 x5y2iaM )zf2]fAl l:<9A[RAK  0ޙ@,՚HZKbB*<(ZW0)j5*jl>? Zi弊<2pWλoM`J5M`M^tva&9R>Zh(Xpj5#["u$%<$E cĴ@;pu'O?mvmd=&p-K Ot*Y.ށi vs,$7Knbٿ~x#?Odm ζ_[  H ;^e6]\ dp}!/K.˻0͚ UHp;ɚl?ɥ8 >}d~`o!RRu{V:˜:C~STIX-砠SQse}̼A5"^2qe]E q24qvt棡j(vqJ!hՊ_Aj[ ]G" @ntRt%o=) [~q9!ª-}J !|6 l7% M`y`CM1Nc)XC7X`S Քdsݢt>6 bX@u+aވiY~9IR_WޥP^G Vo1Xp] mRQnOU͟Z a:_ >& IG] nƓ ǓJ'qO3DG3x0|jC%to!#0x >}ff+;n@FY.֊y.CDZ 15R|T{G 9)Bj%RKJʻs2_QwU7ϛ}Dm./"/?gPq|s׆NNm]anʞ S&hXAq;Rh0i~ZϡAip) 9[5Jei}Z ݜ&YI!!z*[~MPhjśDTvT1ҥ Z kɐVN! 퍙!^aQ\hp΅vJ_;UY%wq+b*MjMJw"y&l(.ۘ3깳وj~tɝ%hI%&cGFwi~zrANvS]/ڻ}Q%oT).J˛?X!cHqv'/ε_UH{r$jjxSC:;cM{;`|S:TFfCBi< Y BdU&뷗OJi/F<_11(lVsȈqOFЊLj ]o|'V}É)Cg Qث ] ,&x,'`Q)hEϤ󀜂 f$$@ApThQ:&Nq|P5vL>^ec]l FQ`WR9$zO1Db98Apj ) shgנ1]Zlby̍^i`cyp[ݶ3GlI{%Ơ5vZeV`l)Ȋg ?~n`l.SGڱ] 8k ~&@+Oc~d/A>;] O!39sT0=ľc…w`LwFE #-6A@hm5Q=c)(YN9 g2b40Ti{jOpo yf4-M+RbM\'l-X0jno\K/e5rx Kw10u4t+'20E% kjӁ%iG(ᕽ{{ h{RSM5x3?5_S; 2B@tW}Pi7 QbP{(vBҺŹ Tݥ= -+~㸫^J)ZAu庴ʘU *6T:0¶x%ݸ'E_8C+BDZRߡ zL%wqȐVՎju?6_?`ӒolLQ(5 =&{ٿVt.Jn^LF[u8J =uA(ڋ3Z~P I-' 9 %F@{a {w}s+ fPjo&Ug !gnSJQzJZЫ80pӒ⯵8zI֊i!= (ôSiY0-G!E 6ZI]Ѭ*W!/<_GiΙ4b ük 'ʊ G JN{B'!ƅ 1\ =Vh"*EbJLDT[?Z+iwEXyI˺`:Ӑ.8HlՎ|?|pO#{$WJ}IW:W®Ƴg!tˍWBcEQcU}PRŅԫ##A$ΦڤHT1łEdAwŰDׂ+hq%D)EK,E2b`P2pD5li g#z&]zQ{aјqn2}Z kҒlif߭lWb@v4H'p2*n qQ8FM:[Lb4\3x$iP'NKyh!0d ŵ-*Aa4*Y24N*+FWAmwmIr[P6 8{8lFڕ)ExWMRҐJ=!0l gz*24=r`q\KQN$r- pa=闋sh\8\xB"¥ &2]%N'BBHJ䀿\4H(@b>脡Z&as$Ɲ}RDDg2$CN8#$[CCpz4tf}nv~ZrT4Q1F3P(GU$d9:nW2erh * w3/\@|ERJ0pww6-n-Z.1ݘO7秌05!l0TBO }N~۷+5^fI@̕J?sBw)^l-A ㏟>tGJ:ˍKf "߿/-75\W1#ۻ/J7\y %8t&`(õH yiX\܉I?;8iFjhCs4ۻ@h?CfP J&d J&d,()Y"jK0jlOD{<<\``ι[a(dӊ"0K&a@3b)Sbt_~+R3Nq*t4R_}_EIR9B(N` H,nxOOذ9{9~Muh9*#YJF_P~\M V/p C^)D륕8et(Jr 
QyoΤX8FJF,q@DEFK"ڡZ _ha` =Nvlnr%ɇL뷜V*Rpy{&߸C"o89Shf+ֹWKnylrSQ~]s[y_N۹c;)ТfM3mƖfB vpW2ׁvj_]=f|Aw\UBiUBT ~Q)y_GRwC1NGv7Xjt^|/u4t&xդդ˫GWc}Uk#CW W|P]^P(o\#Lm6. l~3΂򦏟o̴("RU0~(q8򻊝A1ѼG٭C@+P'Z=,R@ճ\3ƜôRqcM<`Bo41iOP/u{mA *#^TgNk4Ʉ@sa> ,IL&,DE Tjaji2v*,nY#5g;hT &*EA zӲo.Pf3ŗS d)tGC{a<#.KCe$h8`=yzA^:Qk/p8(qDk4UiE,?KqЬz? .Z޶r8Rf: QCW>ѩ7qF$5Y<`-wGhϾ>_,8/#ځ)"\Wd!_T}Xcm&dnN3x9Yw [6%?ɵ*M}0![w-de}X%tU Ugq䟬quҳAeR*]k>OT?p@7bgˋ8+6R=̺6L0b\A||ydW*0-ʐ΄ӥY#ͼU Qb1(;ڴ]Ot8.FEC VAh4_ Aփ*:XeutN *0*FaeBo'(ȴ.ҍc#*:6%\n=MX{ -JipsE}_UCΘoQGB¡:e(u\A[ }|?[TA:Z:W)!TphP׷_RhQ!!Ofi,9PNhD#2M먲[Ía=4 WiI%pG;n<XbKp$7K&.p<3zLgZKWɷmBn(^[ vu(Фhdo)%ew4b|wߌ\4}o/ӗ|D6/#̗eKo<6NB!:\פ0<1G  Q a&"ߣʣMeD[tG43zD(|7IѢl88{BzKe|D^@9H;qxk fOXf7!\=6\%}" D̚jglBߎ a zl l(FP.[`_Toh*ʖ]DW jQ@8񦝛uCTPgT_mgT_ M%POiӦq 8_ 9iv} >TP3.gځT/3Jm? Eg 8ErQÄ[b%*9aG @VF3g8I%Ɯ@gRyϝk=չ:הSFw3"S*gG`A,?qd~F4K1K B2.%7qΒbL qj\tRo.BBF̺X,=,32&=>c.WCŊ`8CL:Ly̓>׹@o߼*bZ9$!*fHT' aZ̳RAIP@i0ɧ/FuI y$fpc̀?8j,:9)8[ ,MArJT\l(ŋJEf=smTQ*}) 1 ZP3HԔ:6r9ZƠ^H/cW_JxWfL`FΈ͓C, rhӓI? H)Qgfrq5/uOOIࢂ)xDD'.z7p:j$.ʬv7Lrٛ7%grĒ5.IJ@0%S"vPC w pC@z̰>{zϳ쿝L]nA<ّlp曮M?X۴"wo9-{iޒ*"M㈼g+'EDC]v:Fd>\=^۠00C `A+#4 b8 t+H0<(J)Y A1紶:Dm&bӭWa3gkR <x{Qk9HրarT2|nK42X"mL^E!:h|R Tt§!.2lF0b0cb9 *Gw` > j#sStRb԰fy[r4s?D d񇶷+ i%'+jv|D+Pfn~\Jv9aWJ+t.mWh u)tjW2m78fƤޢ`z';wIMfWbK"5 _0#ƉJ)L_t[!FBϥ3eOΔaG cg8ۈXeV.kaʎGGwsg ZYsP5~&/D-lECgoP1{qk/25>w\oZg7o(9c"3*[ v]?iu~]n.ߜld1{@"s`fs& 5!8TԨ+u?T.w[bik߰5k^iD[T!U~AtP-=8,RBZFۦ[ <|}ϴ$!:>]L~s${ÿ؛r+ԂqECr%:luު0#C( /+d"&Bi2q'N%P:$цr'LKnқwC-=",Vۿ %HH,m"Mx QGLj9tO;!s[V5+ς;zMed q JDnboW+oCZIDu]Pª^7eI`_:JAAreݬ-Clu zPJ9 XZdK6Π"Ûae9>Nib!ivu>pAv`ϒ ؆B^Z P(Jr<Q`teOBFIiN Kfz$@ڜJJ]ɡZeTp$Fha#!DD@ k-yb@S یQ#/*>f)F|D( 4:GkmVE[pN˼_0!3l3\ ؼJlɑ߷(Ҳ%J-uG ",V*; ( (Fr #8%$*x~6 weϳ///[` A嘽^=4=&7G0aM_j'6R5FH @CFI.)W#BV\\IvU dwy8V7AB" d8`dQIfkƩyEb.{"20HݎZ!~ FE:J4 Զ8н^~à Pt yq6EP3lf"#fEގ3Q+u yY|dJ:f}P>5Q!u Bga6%E8F8#̓)3m=(&YYO;Na["mfɨOK}I$zvĀM$*9QslɰJ;Su-_,|ӑխ kpVh} u6S ݕ3SSBf+՞gB p"^. 
&ͱ-GD6x<Mn]0 qG9F߹( c}ں\P!z.(-/mdXp&1TyAX& gaŊ%W J&j%P#ny] >zO{ljiG~ȑ]KHD1m~*򓖋5#5'*m$ i;ۧ`;|*2srr<BBʎs+#!B-rI|[*QVG)rԎVCjMSNeŞ#䡭 Y+O"$BǙ!6gȬ3D#n$74)λv{o lR' [)Vy 9Pxy3X9C/_iҪ8XˌgZi,,aUFv~URF!%wѼӧRo.T[y=[pH6q?Gk瓼t0nJQc|FzChm땍H!fC9kB2mVy-XG[pFl[Os><([gg#25vaN),%m8c|*v>&HB#5Vk^6x;:k$i׮4`tDy>JჰA@̝9BJ.uz^ Bdy5NΎwuWC@ē5{vrhwnK%-T;rlb܌rLؑr)")!B^p5T񪬡ݭRD6ơ ]JzJ4\ WXB,Bry:ό(ς!eX 'an*^JB\`\z̑҂wA Tp Zz‘kD4=Mk{]1vwS~M䖋 wm(._J#8q0 o i?Fn7<74_7e3`aXt7ӷ/r_Is?]v˽K^1qD8ǯX8p .=X6nnQ%1C^*0KwǓ裻󅮸UMC>|֤n˘ [\RN\R7Hqj[AG_F,,'nA4F+GWZ%mâզҾ(fFgT6}ci$S&/0CZ+%WJJ=8o%2Bg[3@ڣGpMCSH9ٚk!p֬ o#mY; 70]嵇_Bwt!OYtZSMƬTZhI~ʍW'ErMFڜ s(0/qVRPFP, kKMPaF솓 R5wkd5 mG1XUj0rkg8lj0sPp#<2F$9vPe&DOh}LkgxL Ii# M񈬲\ %X\ =h匬dWf6ۡEZ2O0iUS"S j<5c)NH".Hc2*12ՠ[!]ܗIpIk]E1x$ĸe0e|z1»/ p&ԯVe0,Le$@!TaA!( `Z(Frн3C4WQ 2O?Kc| Q4/D+<>=pKaG/&/?j,"(=}{E٢ѯ;36fd!oۯ?^ 8ć>|a;f6G1;G>>|L6 r`/] cEX2!/fno'7cY lߖ ;5fXU~ m*(J!&I+Jߏ5Tz[ dӝb=G9x@/#s!o{wGba]W6r?Cm4~1wX ǂDzh(w'oE^(g%вxNNq^S[Ýh%umcIc֤N#`FfO.dÔ04mL{<`z &*Z lF2/ϿMa}{@~י .8`yݎX =&73 [(AdNTD׮RJyngS9kyNCԝad'+Bā= ZZp0gr祴kIΖZ8J^SN-~JKN~FJהҮ$=+2 ʺ¨դ;zb(A,%$FHS4KL6WgL:`ν 5*/˄d8S ,Whz X#*ai@^z80t< RN -o2BL(;֩Bc{J*4;A Uk{+> mujXcsu+Gy}0Db>f|!F/G ǂ*}-4i!? CP2MH|I cDwĺ||H!yQ)u J2fGwNSPUajD%?}ɽ!{v=, ;m"Z`tZ ]t%N]=zlQ~p; =L<?9XYqd!rSm71z&4h|_cu~U쏋7725XY`?2?̼h@=A{+,7XY:\yPc2ڃzY^05`%܇u1cCabIf3G(p#9'10,Ƀ7)fFԂN1[K_` * |:O ^[ ݀9O0G6aHpzl96VDqc u\Jں T>wo:Q x"d`C5*f JĩN6"{`R(_U0Upb/6?_1фck{Wk;AVu FYѽ-n  g]) v:aPih g`;o#'t15gڏQMnѥH+ƵtS8E>s~җqE݅lc!) 
_|"avIX9'Ka8xZ*1ף3%\{l>uP=%F,u8&Z5ѿONg[m5朊e ݗsJ04S1L0~f}nBZ踷(zOORٻ޸dW?JU}oك]lSFh "ɉ?Y!9h8dꯪ5hٖS#h 6V5ݜr_`.ie|W4E:Le?ߥQ~XmMk=dqNL)(^N*H~BWL̝Ff17jo.ta7l};'ZhE=,S߈?)ysV8>d)1myrvTʢ06LN4gaSKЩCcU'#epGp૎kr=o_SFYUyO=wv2LԎ^I)Ac^EF[*Ass(J8" ۻ:ʢ0pesn6h2[?m B 1=EJ3{`-6UYdo5`Lp\B99{Dgs"C[95ݖ,o{NhdrWNN͕{@k F$pR0$./Q\gCssMRdڠd3(:% tIB+QT3csɴh7tDOQWF' y'RkiBdңtk,y!8SNdإ0Y.)5T;6-4ldVVZTԹR[a;*ˢr-mSE}E炾Pgik i>' aTn?Pp\rS.|{ႩZ~Wo> B}ONo-u1gxSƐ71f{` =Տh.V|em\[HX9.b*U4~$e~x}UWk[bď4<=f, R皘F`I`,I0QK!٘\nK?}ϐ0*^]h0b4ƗG̅5rF8+cBl<Ij/L_{}áCy(ė#igeyO2!F Q@TbbpDp,C}4N>!W kYNhYXVR|*o* đl2s{hS\z~U~Ffg DύW1Ϲ^ vo^\&'UR6q1V+#CWCr%DhPXGf!s :M>[}r cԳ Z38 IqG.%വĵ+iRxx5\Wo/Q\4ҝl7X' cBWzg䌳kѦ *͎CT戬Pa7X\`/ZFKQ`KI\l, !vnU\v yͭquN9A\2R:)48P[ǝ\`8tJ!Z|owDUYu79Ihw 9b 4 =yF9#Ҟ RbA0h,׎Ip47bdBO3} {d[LtVwp5r2.Db@|vAe\q˸q(f]\q5y "qU śj&C˱rVzCR z<1.25QztrZs-G7}IXfp Ś ;չ6QPT n.Y+yxjOڣ@ʩ2cPܐ,W|8M~ ^Z~\Y .ЉZf[ImN`7*grfHͯwaKd2ؼ/{?2Lo*?Lzb~MYez+ I^7O}~ lWu8Sߴ_ 9e&f.SzV Ǔ[7wA-td|ږGºFa,&&Vb޽o)Z+)CBȏ"1W8α#+ ;Lj^1LzOBӿ[Lr9{cU2LĴҎM4GPL<:G +o"O/kH7@l O8h݉Mt| 6) aǟaêuQ-*7y3r;85ў MT+ е̕Eԗ_\JYZzWR_.ƺ/4\|hv'&+ڝkstG$ҿB{MZEndေ!o]U7vt+1ܮޗ֊{p(TL(0Sf|avU5 LbD _ \ˈD)0oK]jvKH3FHSZz}Wnw>]\MKFрLPUuJS^hOtpIKj_הt0X|2_xլYŢ `CD\뫝E:J {(,NQKKTKaoḕGg 0}`t6E_e;gs=!etaSO3ɠa;:@ V? xjv^ck5;甧)lJb2/Jy^*MTW}d3͚| ѪyՊ(M$;zn.uUJ Z͟Q/06 LJvBz֎UbB$AxL|.V>w d{h__1fn_qyբ&w{ZTH$ʻz.닒 "=zHV-}7|{x{AI=4( ` p 3Kvc@88+6챙2QV&D-MM~A@U;>ܺ8ee&gp}]V5 ˇ*҇tW|'[{ &mUs/B]豓4}蠳ܶv$&RǟKW7=Q d \rs_7}W EwOltOxCՖ|yx՛ܖxߌBE޻Ȫ^ј(t6>'\}MEVEYk&:d9]@vad :g 5g{]v`qHq{2\A ӪO R)5@.lo^F#ܰS tOaT~λn+}~,mҕV-+e>1+\[vZkȚ<`u[Zid~k*o \B/2,Brdyb3vMKMlHqw*zvm{79>c7RS7/3 Yvc׏GFm~ʫ)丿'j+>5x8!$2gk4vVs4\.uqG OMF/NJظ5X)v0GOCag`䴝lҌFz~V"z 9/16ǎks0+϶-@$7$&d,Ui#m4jdVG'B ,\?ce΅ӺHq1a(>4=Y:=}y`=y`}hq^h) ^2N.Oj0 w4Y-dE.o Qp7R62N.) 
e'trɔ`g+4jj`m7WlԃbRqu|-ɣ,n1ܦKӯݍ*۵ Z3Nf06W?w,g&fsk]zDZԇk*}̠eL^cUʐfޮ&fɠ#ҳV1fkvZY1gw/1%X;KE a(4QK!٘ȕ4b (ŦI$6hޠEU3PnY=1S  `穳uRSg=uʬO4)!$dm:ey,~* lWt:=EF+gtSa8'.6/E 3Ї)?.υ/3#|0ro[4WMP)oR1dDtx oPB+,?N] 7_*.MMm]Fߍ dHULƿV3|0ZXdJca2 ظ{LfH|_E]{W}đx  7x8ǗrT>MШ`$V 1{lƍC*b*ui4ϲAɹT6E[/+%0(z E$'C"5Fes|oѐ{@c" hJ$ơbRJPsp4XEU>̶Fi@:;h:CmiNd/G5 Cnr[$ГK 9֤<0ȪVlr/. 5ˊw)DO[3gwVЦwm_!}> C`yN_^`)Y$% KG3տvWS&*!H<_.Y0j.ϾnS)ۣKBz([X<v~ᯅcmLTFZ7gD,c\znVQ9S[2褛3)pX1L,-9Q 4 Eh:P:42K! CBkh؍YsYVHn\į;iٳIo^!i`ёRNDN_ 5r VY>G>0Z,%u%:RUdBN!=Y nKȄN3$kXXyv62ϰgQjFcƹU˼!ȻKa8ɧZ$\WO@!z,61G3|Pq7|X!wX.;妈tk˴OYxWcʐ^Gã(]}ezHe/q߅]kecuQc.j\UXNƾT8(9Nh1tpFа+$e!RN:m wfHWborj#G_(/ ٤֓6iNr^{K)K)oxK 'N&{M:Iy/]=rʀR՟;nܸ `?˛y̓vSq:L~LiA ܸU!Y>LxAs5rJNm5׍xLncꍫDsn؍6|$%$.Vgo`/"kQL禢9#d=bMȻ}MdꮪLHhctAnMpl&hb!Pɦ3#'Ƥh=5 7 ɻO+Hj E,k.[|!U]*LpmJj૳8*rZj:!l?uR5XRIYہjtg COcD-:ec6Cs #+05M-v9Q׭b-tf/۷ެW/֢^OLMѺJQ ZY+8u_xy:bBdfк^ro ^Q;D*0ͅ9]7B2CS$_ͭ?ۯl9ed>ߜm `d[oDT~2OI?qjc `]΁B+k\Py 1^b(XMWcE?|>.׳U7ظ]ھ㲧,4]Na6g.vJl[!!:)󒫹ֺ6^@hk:\8aLTl: ɻcyØhRٛ1A0V> LSop0 p26U5$3Ƥؙlݺ#EW1v,; $eKZ{i]T '8J{ 1E &oA8OyS,p^_> hJ3`Y+Hj=Ry7] V*/T.}sе <c&zKlAb+-%L: v^ B-<`4ϢaisqC46xx>e55*<{jQ7WHed6(Xi |4N{a7Ũֳo#Ja!&@p^A9c4T &XDL: NvcX >ʛٖwA+u@`xc=hleTs,F`b Qi)ywWmp_?^'À}oo$3JԌ? 
OxTI"ywrQa*VVy_9"9"R=3J3#onˇ#ϗ 937]{ܹ߹\c ,mrQ\x͟1A(T+NV} l#Fx?K܆ŃlTp=f'tne' vO+|{tRt"|:9ٓ8lo?1ps88_/a`]"筹< V;NHԒQ_5%F,Q0)ܞS ̩s)aMF e' $tRx+E>PJȆl'h5ܺXZf1(Q3Ւ&2ڗ>:2SitXgDų#/.BlHUyJ@w2e"rakDy%X E~b\lqe",%̤V|yZK B\#$y0RcNHK4Өnޒb)kPvbx t8{u$ou#Sx}yG:z6jx(.ZJ9֫;{pm)xyw>EʛF* QA**Œ~ Ũ$':GE` |:ATE9F(C?jC/2^sp_Ӝ/+ݮN6%LnLr 枑sD TdsDs&z5R K@i8U9t8ow!Ol3oɂghCe&ſ%u{v3OfKڧǟQ'x@o$ N' i믓}0=^4'BQEdfsTʫ$IEdGd CH-hD*tШ" ()RZUJX-,XXkRLFˉA3Lڝ2V_9cŪTOn/'qBKý'/Ul'8ՃW%OK$'KTSޗZ@+ԟ~z`4SKՀ+~X:?۟ʯ"~pjEp>},%cb{=lrDZSj_<2 w-a6^\ J׻g ZB7H= &uu]^_}^8KkcHW_}~ijWK8sdy %@x}E78˾e.1w<#2&.&OywˇנIcT응EHK]T%Xixkh%Uׇ6Y ^Kb9N3tϔC-mij5HDRqNJ”4ّnUP"Q9"!j9xcUب(BbjnQ .}^G& oZ *FN?*p4 AS,>zlEA[뀥$6SE&0;lApn1Qps] Fs <6HszCz)mɖΗ, ϝ2!6@l8ElҽamS>C} e!Y'7I@{UTgv՗ֿ ix5DSįxrՋsd Nc 0uUG4& \L$V0 wIa\KuȄN>P+D_ն: >J$T.?@U!l81_/ʑ(0*wÜkx{xw_}NSʹ`q% ->КOs`d/LpΑh?1YDBYJ-,M]_a~x^?0K}d#ZQ~sɹ#(޺p0 Ha FA(̘`U^&(.o۷Le&WpZ%\;;;sVq0 .kosyxǯWF?܉qHiþn-Po.lU bFZSAXpB0`Jb%Y;½IqiM9$,#3)AnRH+6lH9Wxms-1gʚ8_Ae1nC|YKޘؙcd_<ey'7qըFC/`++3++3+K)ef p*TKYRd`9Z+%UBC aڽc*)A?uGF=Vl܀Ty '03p1|/F_z<85ԧHŵ*6v~僇`:Vgao/DQhѿ(g &uYPY|1z׉.n-63΅:ԑ8B vMl:LRѼVAQ밚 %4vf)0G?\E&fO!˿?b4,E=(tb7{p/"ED½#>Nn yWb4*ȒƆJkZzAI)$v ьu"8ߠ?`Ln<[cGES{?1PIDh`cV9}A}]wV[D~ւjYMiillj!k̢4b P/V0lQ$t${]V2ڪ(!%z0kp$E<hIX[w2GP$5h+Q::;|!0tlj&ks!KHa"_^9I/Kӆ1Qa Ԟ$mÒN>E m#`s6wg)ݮoք*Y#r"zF(457bCm^'CdD:>hNK9&1zMh|ƌA|&n]r  JYy7VP p/xC ģ5m TY+>+qt\Pg0&/ٌR KB6 q̑d+ٕ}:> ,:ę~MN NRHT^l2uEVx7R(mw!wTݜg7g17\h96g=ج{O'-l&ՌQZnW$Wc.u}t`}݋1G=ͻݝpJ=C1&6/o4/h kBͭpV]tkhh 1lX̯0z#Pp_b!V@ܺn"B%4n=?frZcِi'Hr̝wihDP)O0@oF)iwݧ( o Oξ<{,LR+OvDQ]>{:TL?T*Y301Oq hbm#W]a'W:ϔݚW/G2T (o)9)ɚ\W,Ǯ?B+m 8-RJ[-7 >aZ{jJ*af؇xC=D3Ԯp 'r *<,n1x0#ĕfŒ8AN_&ԃ5? HKL9ܒq~PiS f }5```Ax\ |$xU8[D7d0nA\NF՟'17NORy;j-alG ) ຖxfh x1$^.2;]Ɣ#bD>gNhq? 僸WXsÎPlriHUE5ucM/&O<ďOXi_V8gDk~{?:znw"" D笙 __~|ët:+LJ81; < \C[j)_] N (NY|f㚗RҔ@\ӥlY|딇Vd15U G Zy4\@0 ZKYcy㚗X^^=䔩R7ٛ/ W6v/n>j`-.m}^rx51K{$(j,,1ZŐE${] vy3əa5n m%Ƽ6(mK֏ǨF%4ˢP5!;zsWжDү0n+K[AB8Th,}Yl&@Lf r;)_KT,LbvcD6^z$? 
s,C;mhE+<\=ڝJY-T, iE $y"X,Uo4kњIciyh{>n[fIc?Ui"IX8`EmCH㒴쁴6nV]Ϧ9 Yq`Jy'wxtIAKeMW .q4p1AD;ˌ+-J 4)ai).ٶH]R C;t܊3ٻlʯ3rʥHo߮ |01}"?+Fir1E[L8&T%7=8w]Ih,):\PlͥJW-*%5T쀅/VUzcEV"Ϛj j!:4]@=EG y1!Lz ۟OU"G -1T`)Xq ,_\)ȭj-x}lrr!odj$)X1A(-BfoNJ+LA^xMņ1DbԺF74'7$k8CvSTRYN5WJR;f,A}HaX Bպv#V.omAJJSگZ VW=ZSɘ  ;9. iRG o>oB'vRJ& u︥c4`m0a>ĄjhALHZV۴w1-AE_m zb U+ƗV1I۔)>@H"9vdƨN qUHXu}J\MJQI -v$\i"Z)[m=_ĒXRK/%V[. K)!>ؕF.%\JQJ"V:`VhMr܁Zmk*1R4|3[Da.5 JR+ STK֟:N 0X )B1z;yM-8:;x ) R)*w5lFEA`V#!@-Hj'ҰReP^ØD`"Q*%c8Lwӈ iƠgQ)9I@FA` \ 1#6'-sD ep(8K"iձE[H)l} Ba.eM;\1r|J$ c =D ? 4 'wʈ Z"N8SެS{fڗVLIT[0k\o~\1z_~/,e-j}|*J{Qk)"3Νm|Zpp_za K4O;la:_by^+!hOǷӖWCf(ä3.g?Xz&>29|~T8ڴ 2oE/Wv껇ep;;WXS7nUշLSE*cOfG-9}p\4a'ߏo+3'319׳QP/{Q5>#]g[IJQhX%'خ|bK?ɧ r ?ݻHדPҮD¤sCQ]d$KJHcUK6:fVv'<JƒVKki,Ys5ԱHx^#W^F[hڶS)!Tp~^>U!KD/Wdf࣏ي9: <%)6P{N˦{~~yW4[ ͝w*}?MKA-Q- |TU҆}]ŴT4T'drgRx|>X$rZޏc6>Y.,۹c)z_nSeZ3Aቲ CgU2Y$9,Iq{$|m=jϴ!\EstJ꤇X7`uAĺ1յu˞hukCC^S!{P::@jm&i}q7=6xvo4"f@Cʝ-|o̜<{lfTa%>B^x>?T}K:z/Yb8m!I> :B%GrDFtPouʠ16[ov_vA'|7.@G_ct*&j 5"-IMQk{jfݯAז*^vveURSRS0$ymct)&Ԅ׭Vw/s-bH@*5Cq>/-rn8:[,nCN: }3O/nPս^P8UﮟM;t鸺כt!8kqINKϝ(wd-y"'{-'gw",ߧo&O*Szlg U?ԓ![!/ 1kCYg<=ij Gm4FC's» TD'x$Y 6Pi@9hYYkju4oW2('d2FU:\&ޥ1PE΁ T4IN>/#̒:2ρCQ%)@$`:)X&~M Tz@n/XoUثB.On{W!QA8ĕ!hLjB\rXG(OZ;0.}bjb:%MKB*ܨ>Y@Dm20OTK u|*xEDG0l˷0-%Eh0 R[rKW"xOpfpw0d0HKWms2WnseX#]f#-jR;"( ?I}pS3³H4OHud:[&#E4o9SxMR8y1Iɴ Dh=!BrEfRv&H|kT ef2?6MSu/K=Zq{9oY'-|ԟ@%IDd Km(FD =6/)C$&EkCHKihhjˆ*6)6_A{͍ Zf_W7J4gLU #A鴷Ja^$:PVq P0 46MdD?"d!0JklM& 4@}5ɑ"T@5$@KS ЋI( Uwݠ `A)D/C#7F%pH rPpJr A8JK@9R9en<wGpD-PT?KA'9ӋsSkNcj7{u+Br5Vޣ[)<1I*)6!6j5̿6H)>y,%WYcoYq("BKZېTb*o%> y~+/4%vؼ"md16[xRʬ휄SpV '1FmD %)'=(Bsh\7ynuw;M.*rzmE¬1QӚV>i GM{k}\^\Yl"%$nm8|,z]~QY8]fJ^Mfv>bnQLAei1&U&}d5+ E{l0mVN[<Η {45 ti]hM,VZp5s ~Gf )s2iB26`OkuXZG GsMZؒeT0v6CRi'=/ʌ DrDWkrJƆ @ 2?,)էޙ9{` gjU eRox>/B5W%^̿Jr$nX73'~wWtӼܧw^^;ˆw9ht4E:M\,B{;zĂx5ˆw(1o뫯꾜ѝýڧTz VST IG$a+u1 ˒7$,|$D(2^)6G'ioQDlp?)=>)8g;%Z,/U %uBY+C]fL.͠ړ9j #t1  gb׺HQDw@VXJ`#T-N=G-"> V&ӛقi?oT踺SF6JƨFF(g%㤴Oҳ/@P$8F0$<4 O*w(:8Du>,4)≚KpE*rm1HcM>$YhmV,)s75 Al`\H DE`'R *D 
y;¦ɠr,Kv7L<K*zB"mng,W@ "H˕\ԔB&BKeG`T #;|q'I\ݜϣp #_5ekYn'DI0=VsMuq6eeW!{g6pwi._I8)wߝ]E2gzA1;D$w:JZD*̑;0kUV~|rp#Od#UT|R"U₉L={v |ݺpqvZϞ%zZ?޻.lWUYm{cJj*e7U!- >llAhW?ŗK"w!W -{rިaС#ݝ[. z,  42 HI i/qFGi4Ҁ ĂQx >W0zU{ N2N =JsTMkv.F\3j\.B. =t"w Ђ25^|+ΗUYJ7"!0A%ߒEL^*A9@x:X{WKvY>Cخ_>z2v`ЫUگG T*{P녆wQ8hd4cS$w!Wh~AE;mQ׌M 9}cL(=zP5byV'vvE5`K >`D֢o/`j^c y&"c^MiJ֚/з+$:d{+1=|>YRFHW"/PԒ$#AsK8,6T?"d!F8ZQ!:~+UI3N\^.^ 3ѩ 9U )K 5P#_x'.G ծ>xIkV௸x+4z:bc}9cW2i/Ѩ c-\fՔES%BcTQ#ٹ9®9 o'cx3q,%T9(=˝3QZD+s&t,xOp'vk4yu'%oLJ (oZS)&)APdudäT!1h G!jje g6 \RvqBIfYlvu~K7*VX]<~U4B.>d"&K$p)M!H=DV#YCZςmC>ΐ  6$$/ Q_VꊯJ\ޫ+WjiJh#Hi-UYrY@u͞pyvVv.Be'aA9[?}#|Zl'R(&1W?_^G| Cm"zZ/?<˻Үrz~vk?segl]%X?>L8>OpfJeӓRF)VڙO8ny$E;Xf̑k^^#O3G X`T/`z쯬Bz@L4y'oc22|nuƤݾ<(#){%h4Ld_:y`.IHC-iPBN?{Wȍ C/E>:Bx̋@ 1LZ}̄&YQEY~hKd @f"ԁM(O0.5|2gN$FHRAvٕN(L"r=ɕW)9W˺X[Z!qeBkkAN`8~4~"֛WP,) 0m8 zĤɎuEm̸ih*&)N8!TeR2PiJh6 zv=$7JiRה0B{jl؜@y5s9K3piz؇_wUov50t-y͞2#!.ԀG$bIWNØD"hg58KUJy5N9Z#FY-Bi-B03y>X C3jh&|[fV^†XpݧG8c ٻ fo?zf:܏]٣WV}ל,jj)TU(&5"6fZ.ԕ=*UϖY%[QM93%;G]ԩ8bUDSE;G]V-ί޷RT^ ,i-(zJkSĵڙ8.YL'ڣBuwPbdyn]Y ^)Ϭ0=Z"uRWg {Љ[QK=ZZ+),PWFG6/Bt}B=`^*#􊽌e׸ t*}5N.EзBlKi7Y޺q Ő,o̿Cт杣'# o*^w:S - d2O-RgV`e.!_) F4&TnjccޓW3@?\sЙ:kDNlsKƢsW>Y K[ʑ5muj*L)c^1?eeRƫ9 ?%\48_+ VWq~Nኼ wu-~ vRapv=JxjC2*Un t!q6>4׿.>XQ&("r:gU@#.A @YٛU`LQPTЅp/("֧ѣbnj8ƎpQQ=hr([i2n i&SIkC#> ,C(z _7zHgPK'HxefSa. a0ǎ 4%`jE%7ojكEaqf0\8JW ao2 ~ZATH{X+U|"P>\LBw5P@!j05貞}BF& *Dd7e?|8 }x>{u3rS-F}ebov>_n@0 PӦa.z;zZ <= f!Z٦9y4Ʋ&8_pGi GL X%[]"],kh$);΄OLFò4g# Z.ηl3% H+ *mh$TOj" bUMk`g4ܖd m$_m'hJZ>QS UGhQ]UBTqE^^j%v.gJ:\ߒ{T UZJQ“)k,0ZĥVZPmfGeUc^ne[`RX4|P‹u.]f;Pqې<a7gߟqc/colCBf˻*ɫ67'ͪ'_a[8|9?qXoк%,RpeRJ\c'x߉ie!k^>L>oqƻ37w)t]aL /ڎuok%{| Y4dIv -EY|v; ݻ % 4"atfMyt >v|F=|=jv{UqP['Z#ūܚKAE=ߺ_\rBhHCuT[W~1sJUީ9zx<<ATxucU uȉ^}BqKL9EiXi֜?ItB:,1b.Qo4rORx;yfR͚cAh|I8ƕAh*DZEV.ͦ4{IfzX^H]az\(EH+vyAMfR!.6 ;ŏP+A'tkS'%tx1mloUބi 0c_{4wUZ.eGXG{9ői;m$Ogdž.#- ׷cx}NX \yřJCL[gSU:‘U tbfC3ΑPqe/CŨ;2*JezEgz-n]h? 
4ʔͺ h%y2fr;oCtu||w ) U'JҮ]W lxK?/ܽ+:i re՘NuunCW5P+Ċ͒JBeA2E^ B2|9j %/`8sBIC2-RJPKa !$WR uSY)DiZAjxԷ׀e6'!X.v0^saYe/hə,УUΌoHL±(lbWBT8^X|bQ őf#C3J]g`0`ɵ\ۆ\F(kMaXUo)5B)Ǫ$xD}i9LqOyvӺ™e= kja] ȧh0HrNqcw;G0TkMaz |3KH}_ˍx~ޟ!ZX79VSʟUU^8=Ad-rdNS TƽDHLlf~ꠝj.+c>^ߠއd#WEozJ-٬ OˊكۮBd_~cHj|΢zɚg'-eK1,t^7NQ-/w[ݑGJK}fb#rT]v,G`&w=+IMh Su'KiP␈+Ie5)-V3J<ICM,303bjsz!L q1+,(Ueb]Z),be #M3dR$)66#VJLaԑ RU-X*Ơ=% 0A(@S&Tn9@' ViBQ,eWIkC">erq0P߀}#(!:3C:@Q" 葑ɾF\2tUs iFA^j&`EsLS*-UȘ -Q *-g$[::2 >N1'hyC'qrj7뎢 0S o*V65"˷N]־@y ~B?Hsxtm>uȼ; )WI* ^Uy"J* g gz9_!2)vy4Aw006흗UYH1FRxY" mʨ/"##"u {rio"TFTMCȼ;ƌ,E;XUVR p PF2Dc<~?4mxr<(fB>w:t!6a<\M3Mu0{OG: s\ෲ1$'e:ʉt]{{&f'ChiE[GօdA٫2O5ʖ,߲M) V7f,<.\ًȭ.8MtFEI<%✏󰪓e!]>{a^y&u[CT(7q pdvoQ1z7Q\ز Gu A .5OHtw OB~Dojv7֨6d|Q8iy%i]ivA]v/,δ[.ڭ 9r#S5fQ FtQE#nK+"[r"LTv&l w_ҵ75XOw+P+l ꣿ&t=^-gPG/U)c`Vl+fu%8׻=ZNQ=iՓ1 *mⓙ296-++QkJf2P!k֓m%%bx2x5 f^Eo2qQ!ʧ!p78Gtztէե{Q}^=ʓȡjB\43/j^F\;OU@v :[nJJ I?e3.Pjhvq6NQ0+fXC|˸"DEN`e UU&ZZ O=/2t6E:#he O RVU:~XڌkW|2ky憞^lf #u?O5cRΡ}NǟƇJ)$:BfД[mezc>PANqd׳(R;/6M!AK[YlIfH]H WrxS:}Oui>J2̠ZV&~$wU @t9?(2-oDp@7OD,~I AֹB.+L|`Bì0<%dD(9hXeM5(K񉘵v'ܰ*Vz,IqfZزVRv0d9"rnO<lYϐϮ0 T7ݵAjg.+?V0[ tNXkt~!bQ+ְ[w_#Cx:`#;%2;n=oݩgA("=#O?R#J'Os<5uexW vV1HFw,!2 b C<BJ8h{-^tT;@w# - >fKMずc#/ԜPJp%U%8Ce (^K9~t=yk[l&7k*gL۝NA ~E@Z5fk!Qa8r -72P'7C*R$' ,1N,W>gYʰͣ%VI<{*i$Ä #v2<4 [8 n5Ñ#J]Ck~3N'%gr2!?id6mD}Q[84ڤֹ{$jH,E"S*tqYJFHI,|>LSAxW&'M>fFMfx.G0v?Խp>4=66+FQ;a'4SɈyޕQZX{ L kt84VݖEs puWPJ0#saKsw'j/:hbLg2Ŝ_w B*x,d( d4X.'fi'JdLLԃgwI>pL\>}&+)y&3I d>7|h&ǯ¢2~LL}AnMB??5̀/B3 mUAZ+^G~I5Gӏe^{, @Y92e~>j= _>Ƿg:V,l 'v{볺l=|.ƾVFĐS%"tI]R.]RS:? 
ry(5դȌ/TJ^cOd嘫ᐄl^H}vw',cof[pqX.>ja>m%ͷwf)X1er\ޅ ^Պ4qoTу?~| kOfCp,j!kh iEqrSQQlUȭ\il&HN\KSpyƕ*e)h^o4dflir!{4؇fR3!i/8-f ˣR ^^^hs>[5WG-=\-hi\FI<6(]tYMK/i43ח -R׊jVHWi*JtNIJXLBaNd e{V^)+:4 W}H`(t{Mb^#\ gv?lO"uc JFDP]PMST c?+7-/4{['6ܷBп;h~GgᯐsBʛipߎ;/ҭAσf>klfDs2h UH$Q2™ " ] f$(; RP~2^ʹD fUŠUM> xq^;椤جtef[r*iaIErJ]I4)ifU>iMv,=/KgRWoy$oyF` 6 Oxu}~z;K/B;`GSw׸>X";BNaz|鹧fq ]:s[ ҍ%oZ#\ 2CJ;m4z4ڀo*Q sôVOu.?'铭ʌAޅ;Wsg(pBց;u w$PA'\$H# I \jpQar/K"E$y"Z=d4]ݢZ ·xD@$j(`Nf}4k Ǻuj3EZmNQb$r"#S Ϙ*bȒjL z\DYx@q-4:#-hPP](kRBiN d9!q¬95gAKvzngqm Byy+cxnggpZ :NK!d1 -լ#f!k)=Qtx.h){agu2,xxL셖2լǰ́kR*P hd&f |-fmh4f5,qSXkF4Z_2FhZ!blO3&KBuXmu $Z3!RK tZ4n5jYs1ïڥ&y" e=xH#R\(ya qEupJTTe؃HZ ญl>qZ,8# Z ~%ͅ#8RG;6ֽ!ra+1ڴ̆13ȝE)(J,0-(CԲj5o.XMV6k>W;qrY{Y|`T .tqa,E\Dp̭!Eg e`e$e&Z.^H3%dJ0޽^a{D ;mr(p j-pbl9H#[>5ZNj.K b] \kw%5x ~ǯvBQE8ƔP#aY(eʰAX jApNBBuGi&yhϾx pD?*ׁu9j?/;t(->5T$ *Ԉ.p;;b//9* ,cFNhӅ1RIuPFD@,EJ?|!Tc $BiElb9/ڟE %gXPwHC\"BߐY)Re_cz $ DF\ ˙s ӸD-LeZraZlqimpxd'(-$yq"4[P 5wR8WXE N!8cU)zNqifOpև?s3yǏߟw0yZ޹\\ҸӡJ到ѓ3CƓْU woz^>=:Nk#!\Q2\Z?g: )]O̟Pppp3_X67rҗ\YκM+_.wfj^0z Ey0$p8Q(p؎5?#$jx2Q1?Y^Еri&?,e8NUbuf"22qY bAqI!Ji tJ,5O5 6%4EЃAM÷*b|EeP0o1 h`[hFT`>P=q ق&VÊI=!1# S80kPFC3T)9q01ɬR~J ͜H/r?\jyx%wƽn|my}+|-bW7ѭ-l$+~We6ay|y4|{>{xoDrf_yɜ!32W/of>5r%ՖQbW.pw([vE;/c_@ ٰ'DڨO~RS{?-9IJ%s AȦWXO=~'Z&ĕ*A!ck?YVrYJhbN6m&@܊uwKfJy$CA(lgsAޔåt\8/;w8@J ֜#L~FǽA*T紳ATu"q h˷ZMh'[^jnk pw\x}ˀHn-T鏿0a:bz+(:cYn6^1Q.< }ۇbyyh*V;O0{:>}3?>G&_NƧZR2j5 ?3.aa*Cn_u|afj't?c oGy<8rVY#zX@'!m#9wO`n] C4 SO>n)xX@'!mU5>[UOB8DkKAq[w뼏 ܦS %D2.^wB9&$3O*Ez3U9 WRSE1ATFGqzʻ&@| f@С_ӟ3υqMnP=@XɄ˴5T.sRc&st}V Xr2_7:yE8EʹSub2qD:ST60 l9T4p)aci^cFv)흶w VlR)YQ,t:hpxaP :=7%[dk]ya*8M7KpZ명 TC1ZshZ1I]6ݞQQ$Ih^׬Ul0zLuեҙzG̀ՁOzA;B2`Ʀ 2` .!)&?Goy7%L:n#]R'ۻn] C4SYO5<{7؛Dľ#Ļ16UӻEw׻u7 D8g]r*an3zmҫ:SvvyPeUOhL}9L\=NөK}V(tN6v Es|v"-t%]T`{enj]9*(7; Wƫg_Kg̖ +H07R?n~HDl^ҵĥ1 ;7jq-Vs.55q5R vnjs>R3x(=h4v}I?VR#JTjj('Y]jJQz(eq:3qK8@N9&2,N+~q:#8ݶ[E@|vgu@8G-cRlφis?VR3ct(RtB`(JjK~utHUGG1 Rҕ.KR*.N)1Qϣa]̾OzuUfNlERڥY:a2 H ܅>IAX($a]6rr^]圄躺%ʹB5(f>Qh0 p^˜|].%vkhURuz:Kp `~AYWڷbː?|VNHM^ 
u5^\_|媬]/ŏSRb)SRT>ՕCD`;UyCUw`HI>Z!2NC6 FR/]2юwL\\2AKl.,߽tm^-De7‚lnmpWLK%yUѱEb]ˇT;;:?0$eǺ?8P$}PAc/ܰ+JP;;s컧S {޿ΓǙ˞<ѬwdGh[h[h[ht"*#$K-Q=*.2(!RB T2@YpaUfB4bWۛO 2[&f%-EٺEV^C鲈~?:D͘tjyCXAY"YqD3uFb]f4Z)-qQW*T\=7tRhm3FVUXQ#`Tu˒a?~B U ߺ|„,7W ]W/zmq;WL1G^Iw]VDAe8\b"46DL@HRJƵV d)i27-<+kvIE{CU&FCHy~%0 JYRp%ι.*/d.u5ciJF0滯ϯҥɿc_ݱ?dձߺxX-4*|˧?$A"L&??aF~PW.t?_^'"8u,~̟?{b P7}RŒ?{v\ H1FUznv=ҼxBd`фp&"@Ofkepsce;<&)oxZG6>T1C&_[n}ʘfFq`#N(5{=T*Q#?~BnQ1ľ#Ļ16Ux0ӻu7֘Rf'UDŽ~[}Hm:%=L+CbEڊzMb.~@xf]gZ90?<@Ucu,HZ8P q_gqhT#dT8~\kI^zpEtp>:zo%:Y?5Mc ^]ٓ5H "x,hpldsTK`iLI*}DaWI z&Rjx|14CHr*GFf=QaFf2iPiT)Ɔcc*Y:"~ڂ.J(Cyc %wQ&xM|%KRwQE -Dc. 8۴Αh|㉐R95}Y $v: ר6x7bV-heK}Vڂa9Z tgu G6J7V@>RSQ˼?PJJ &DMZCD);P@)+A9RG"Jxnv 1>ݱ|r}t/kqYreh:T5A+쁻`54 (m:m|,{hÍ(]v;+}0 {CU㑽 wzsQF! {SGU-zٹ-H ֜#LϱTWaי<*-9ӣ-^jJZaRǞ^ [reO}+$IPX'?ζ1B#uE |rF \ C4 SO~nJ9yzczHxA'AmKyλEHz!)&qmIM"#zX@'!mU J`-Έw!o5Lyboߔyu&w)y(0 =ޟ~5_\lm{POw^Iʜ#J\_-TȲawb,XfW:,(pFi߅H=߮Acyyǧx wezK "ցwUaWQ7W)*')ĭZ|U v=C5s =.a c\ãQXYK4XoF+qh1J &KrSXʻ2¢g5B D=m{FO[5g׀e0@k0=fsf $fr T~ΈS`V&%v_c䄂Rs޼ 5BXİ=QW)#qerJgw@Ͻ;͂(QA2߈"?Ο*ϋ<QrJ~B>9D0lkw#9 TA'QmԘ[rjw!)|wࣟл :߈nS"hy+[)#=W5" yJȊ%ψ5B)3bE 5_-5 9#3b)0͑G)6=%JI}%u/R7Hm9OPѯQU(=$u[j( +l9{+fRa(R9QLJ nKFbAqΣy1JK}ݖ?Q?FcX³ k H Ϟ?ICͮFJe~܏yS7h1pjԗIH8%,08H<|\{B& b+! QDń4*RL?JcB%5oI[4y[IJp2BIڪeY CtkkFX"*EFPV)j-3JfwͮjG|ٕ[X]0vU LHת%l5r\:\!DtzZ;^ lu-dlp{Md3V\\/ X\Уp~Oq `mXX خc9p*X`զA?;|? 
3|ت ^f1^8!F4x%-[^XYT2@m^l7|?dL;a{QN јA5(c X]~Cv->Wm!&=| R3&ԛt(^2!x#(.VcgddY1d&˅FCfBɂq ID#J #b_ \ :d3KLo):Dֹ=\*5UʜjRBXPTl!fYNeV^2R TY2PceiK*,WZ`EH(rJ!c,栌$8[2e@Z:8 #!,)gʹ+i+K&JLQ ̖ %Cm/?-`H ER8дj;d FYU-$n%j Z8M!rzs{$p}|H^,)pf0$oջY!rs@ȃlNb|cXV .积w?v㯽l* Q/Tp} OBbCG)P=.{hͲ=:Jy`4،^V·k?oj3{-+M-w+.Mܿ(7/_@wֺ aΒNxv9vː@0YwY]^j9d9f㱿R=G O!T/dA#Cx!"B>}}}WSx,}8-KP-F/,qu[r97'Z6$݊mwKѪ1ܕb҉IƎ[j"}߹"N=Rr/#u(:T'ԜjqGI|G F'>XWfWdΜ%0ݥm3- ,DxO-ƺˉ?vKKCq:}տռ_EϷŚQBT}x+2_$~ʿ/?r!j>^U㎓k*\=?zڐcp^t@m΁oa%ZOMTY\Wmл :߈nF_4{ S[M DhwKAtR&E}=wKn} C\\O7+M7t|n6 ;fU ׮n)HC"MY4N>o'n kD0VNl?|n~#yhFH3YډIFQr t¶I̔׻Puyt)=l9-u{דc Kck+ i6K Z*_" QqW,Tb7JK Tj?)G"gJWHYiSUE|-EK_UWl&LXݱdmC$uͳC%9@n2FS`,W9'ɌE)8Lq9Jxu^ST⇲z,߽G(ݭnۏeeoW<՝S:PLtbm/bh˼@Rʊͥ bjl&Pf)b)#Q*@)?Ftl)9'?lR*ܢLT& A*3Rg̒u1IF8~8CGaH8s@qV"9^V) ѦR>dYy0"ܢ|I@nyagkK&nxQJ2"8^Vﮗj"NLk{8 D{͌q690=Y#p98Y_ϼmġJB7vf{9.p%9T ònAkCJ=|$(1'GaG')m6H|k9\i\%  ŹpR(8b%a8]y GNwE㬻{ 8[?=:w YEf9Ƴ gQTxu]sQ^!FN6j\8rSͲ/*.*"U=Jt q jۓM"F SQFɫ膗;R ա8ΆвLwrrNgSԢy7!OFBHRÓ1m*a4('|r&aJn,8RuoxoS[MT#BkĹݰQJɒy@7Ǭq΀|k?)U$4!-Jڙ Ik 7Gٻƍ%W?mpbn,9fg)olِI߷mn5ERL<4Y]uuUnROW1R&)@β̘h^0Ve-}Di&S A]dRƴ0R"w7]M-=Smjê&Df=x#5X, #֐աq.)5$|pԖM1&d4ҩj1a9R,ɜ3c".4Z81ϟPjm'g@~h-3K,bS|[a51 ' 22MH\"fY̰2Nd*r}n*6土ll7ۗB28 ,(Bx/& N<[%4((Ql,y}CL.tR;n z^ºHxXJ$[ǚ ۟PB"G#)BbjS7 YI10 XTZh iTF=@S9MRv2'$hOuAm/yUv)h6L)_rtΆo($5S8\z~3{qRXR|lC>T;Z#&U X9s23tJ mp\D(jPG۝XJ淋bW#.t`D-dBec IaQ" ':yZB'SAV#Ɔ7C4}{ λQ\iywZ_t(GIMzEH "Ӹ0K,Ea$f'J`+Re$4f!Y%{[_CGu QB'K:`l>JQ-[ˏm0p-3{Mc~rlA.(z]rB8TlgЄcGVl? 
Q^ۨQMk O!^L;TGT4*ڏu3ćV X[{օ#lԥ1TX9s<8-kܪ5eC ~1/B'mƫC(qK)( &Vp0O.V5sKO2XJfXCI!/9B4sI#H&,9ޞpX܃2 Et*wTZR a11RH f\XH'qu*Еo4Il{@B]j׉u/˃gµpyiZ"Iϵ*"4&v$tơ!.$49cCh7fo _VBu,#w@cxQU`XfoIf<sq*"P 2beB$(h$#d9^R+~oj8aϫo~JHϻdYcEg럎&\ͪoX ~̓Ԕ[Z]@~ڟ]dlW:(%ʸ/ij5-˫(`'[`m0~sF|}$̀ZNk7=Ӆ;IXZ0?=GY(HQ'-?XAFI$( -TNIҔ'Js5Ӣܼ]@)Zj4AW:JEzmk{E(1K TJⒹW2_ZSX3üfq$bTr^Fk 1)v-BK >oPF*B(RNE6\ CߙjI8h<̓X\mH+??OzĢUv!TS!ij]{lyIoW{Md?{|@-KwZwoqSuf|?ju| m@nT9X vAm~_ȸ1XvҘ{?Q9(ͣjsyx0oV ruM]f!|QexB!b#=kTZO[(\ 6ed([ߧv otݺ@ȑC4SvHc8״!cnQu#Θ_\êf/u[mH/BSޟ>Qu#hSS6I%'):oKRvyCk25zmP@Fij*)I%:}Bz(EGJC(PijsJn#9seCnzڤ;Pzx(eF9+reL\5nzڤ"'4JAdij2 nzڤ:Rn( (PJ ղ6CD)cn(eJQ=mRBNF)Wn(ʛqt&Քz-Pzx(ĩ>gQJS=mRM=Rdn(E{v2*(P %՜;QJJ59 RҊjMO>amR}/%8 rpCiI5q2J%qjg>cNI5ZJP FRPn(Vx:R"ơK)uCiEVF)sCOT\6#XS8$ "M7|㱅F6;ee%O\;uvQ=^8|E6="*2gܓ N{byz[yN@HKs-D1&ȸ]gZnrj7r8N&'$i6 tcȻMtmpD)DH4]%#8yY5REL^p,Q 46?g)ĂBV İi,/ 4",,޽4$!Ï!Ol, *KeƋ46o$/0&ijB"56RG6҈W4SaY̌Qyyrsm0S4+X='I\i #YD(ZmJ-xᛕ P Z0<a%%OeKub΃DrYdic$3Ԍy2T!l[t.M͇"~߫KV툗OOʓܕJ|z#)N=YE? Hq_Bdz_)} sJF2!V]βxPM.?JOc%\~?gxY6 Cf>~(w溠}5+R·ȀFn/CX$=_zȮY;P|qlfvl\)b<FY{#|NXXrw٬ɇ/WV|z?_d00ߋ^*ž!ɵOv2+š%DG:|,.&%E 2X-~&ME ӃܞqGCUWw"bC`!P z <:Cwl>,emzlX-:͢6CK|*)l9[d͗>}Dh>|̽lܻ듕}Of6ײ鯗>OÐW?yWgu.ʙXhKȁwNP:T O!R(X۞.)`N$Aq> [4 ~I.NL_Tb׵~=^گk-%orav&X>xIК׽S,(JOj5_J Y=L?AGt Ov~gmw*'[icEQR(Gp#e45Ҩf~g+"'2V:+DBe # M ww{M6AauL"99L( 'xN7.Z B &C(@K!s[X0.()u6 9F[9Hr}~`|A|x' ?Yτ- g | )4j3 SJ|Kժ5{Rn%Q~hgNT U;[uZIV.ʢ@uV>|m{oEJYx_miIDqyd,8w"5ٶϏ[ s?6ϙ%]i "<b PV=}|]H BWrw9PAA)΀+r((n]|<;ϖ3"K*VJ kaJ`Up$ð_[$RfKEu^*4`cd]&¯_W7?nf|UW~tru%iz~?So)Rը&\}.lV_ǽ˗QdʧǠZd$@53й\QÕ|cIGbP k(i lQHgD#vsKXuqC(\ ʌ K S^ƟD} pBlq9 nt_ @ QOָMBxI6 i8:};=<.(k6 5wLH;M09aE+{uUF58=)t%}8 `J Ls L42nU_&?y} Wx=&i2ƷUWcxz!lFmA5L~a2@]yӨ5ۍN]c3DFIO ; )u$G l -w$s*e cdHg”9U*τBF49Y9Id4w~N*5{T' r݀M t/ڑW r1pxP*x}e&beVr{gnM tRƻ/P @Wőޭ MMi R(վn j?=™f^yxk%8K0KAWWvmgw~y;<[puH)E5e8HfEi\aZo+2ytngM ALWZsǙ&A;WȤ-_7_;}:ۃ`Q% w+T`Ca v s2 kdݨsEÅ%ע^ީ>~j= 3G4Cu^OU nrt ̌cyƩЙ&TgbԜCtV{sAj91n7Ì<-BQNcO(:7B0g5-{v~H蔀Hϧ-7Zd J{>2_uЂNxmO+S&dۯ~5w\ EB)Ez="EMӤ&ϛ<cVRBvA)-QYNKq&S,% 
;i{_wj\`$clNL?'r,--My%$c*EC.EVz7fZE~OG 8ܯa: չ)ݟ a*m)|$Sf𠈀 ]i_ψ֥+;=uږ?K&F3N}>"*:HÁhNp~('=JN?P$\ou]=YrnJ@~2˅e@jr®-TU]] t1Y4gdF.Ż`Qv~9k g ]PYPF-d–45e|[RPLH a4?U[kЙɺ'q5VJkAP%-tYR syA()'V:c)S"'oTߗ-* Ȧ3SnO H9e;1ELt\V*sn*'T_ɦD:I7\Tq.ZoB ^b#/0unٗ,H{zo\ղ{^]Lz]I:NVluw!V ;gb'$[fCT㒒ϯ?&2Sأ:RЭ[qҶݧhiUz7ҥm;0IK?Hv~}M[΀eǛG zT5 ap{Dm,U8XYz9M}?-6"IR'\:\h)sm BL}u C-%eD(M8Bϥ xA3<ʻ\:\q מynijP4RB^p ^N@ی@(Pʫ=-G5e8s^:lմ|UR 3y?u˶L(I8e !uqF;{0Ia3+5վ2U TP!d ^"VkڴXo T뭡Ƈ6>-2;CRZ|}6 ymRwwћԫ֣nI݆Q-4'ԈH5ڍb%*)I?C;E@x֣Ƃ9[Ѻu䴖(X|°j0VԧZ)l?e ;RJQ7gב?O UtwvH<9*,+7ڝK8w[*16휫WأytHև|&mS,Ҧ~'B}7ѡ-5"W,m!oX4>j%Be$h"ޚa^ 灒/KMO6D|$9u)[()!-Q%p(14RshU(,b JCڐJx 4A# ꈢ֯)bUsa8+4j)A)_yۋJYVZ$jaKc}Y,*>팅}GP oT8%/z{-ݭlMLwL䲖:hٲ1Mܽ*&|j۳ b4%UeDRaNkzf8j1KFH׉+^zŏ$[!]~JBKa4`83נ$e 0`( xA~4AYS3mwP%SOwsd?Bz)(sŢ(Eu4b1g`)>o2^5zEMuG-)teQV:f8G)-5DU&#/z[!ŗ%WD(TS"0  +=T+DR%-Sj%om!D&ӔqH{.5,_"~J|"c.b!j +kTCau:LecU\}kpdYcX*ZypXKazIteUIߍ^*#bT񢾿NщӇ?JJT^/S)UتHۼ|lk[nINʬcfc"׎v?y miKCI*@ 7[@ Af]VyTn yF+ (Zpi(tФKǚHL\0ǻ}y--ztmǕFVu9O:I$uAV Щ:II0,Hc&'|-e{6 c.us&MUdO`!6#<*D4yJ K9rL<Y:#cznj=ABIV//sE\Z/?߯wf%B0H{ PȀ#Pvuk|iݺ`3hNq.?߶nabCϨcݎF:5x8>=ҺugN "oc”,YUjR=t,2\}c),2 L,5KAű׷2R I}VVOfV'KE/FUX:F5> ,5l,%ci!5H1KR낾/҇>JXZM,#KRua4.^s%1񉥣f)8bXqKQűX:nc)a!ϪR#3ӮY"i*5*9q8;l,*Ԇi?nc)q X &Ԩ'?r R! 
4'c Bj4DXJ~)c4ci!5&:T.VOK}VZ NHRq,Uja1KADZt%fq,#Kci!@1D$XJ~4ci!5)i, K뼾Xg[RkvRnXʋR) XMK Cȉf)8b_jp,E&T-'oW1**Ms%3rΈF04 uYp#U8P9]b*lT˛7 fߢzu{f>ُHzVT0..ogw._Z9-5CRϭ VN ܗ[Ug_u6j)u?ZnRJJ?mpv|TۚQ /GN%]Jkl/V+pF#[xֈL@*#^7iޢKcM [bU߬^3'S,4NjwWݍ[g枇dW)%j1voq=dKNQ zF)gGq*DP( цQO Vsp&{dGZʢZ _~^wJu5@Bu.W>P)B[F@6G]hHa՜#GM-6UVo-KT0F,5o2eRӔI_|fVsr}J}4׭kH% J_m֙qQEhE̓\r8)$Ґʄ+ܢz0Y)o!q(ۻ;K {ao}ٿg(y,\duqn&RL3U|Ny]QY(J2ŕBbuJ%9ؔ(ֿ$ fiPunȪ"$\)x,z{MnMzn 9 @s8{ &i\2 NS?P6ϯO)Y^]h;1 =φ 3 4?Eru׵y?Kc6!r?sOdvǛ_v/h$Њy-tu{5;o*\∰5ZS.V%NJJ'=l!ց䑋3!硎t985O7{7ֻ7 _C:$ૐ3~U1y-E>u95fjXd)^pjecƀqq`8I|.Ɖc(8Kބ뼮Hvo;)ėK\&M$ &ƾ^66ZZ*hXnAs Iհd8RKۨ/76x_,,'kXjc>ͪ0 %_],/??kpeKA^-YX/֏~^]g{őE|vJ (3~!=D-(fRH "3b4<"=^^Ánj+4/ӆwS.saCPiRr@LQdIbѰԂ4.2eZpS*<-W Qb]C-ErdVY%kH9'yO6I&)Vz7aė0;Kb&sLJ xtbM5̱iԛo8MOSeH$=}Е&1Ox|2=4?6?wRԞg<u =HQ*ڵ0bjum镩i8$5|>FG,T>sIAz/5=z:Q 7ER-ҍ@f^y77byfP<31KJ6r&w,wY歯ȵ.A[a #̍ژBcs4Y9!'ց̝vi5*q* }; %*PڢقFZ;fr74 HOࠨyiur+'1AJg4kR\36m("asx@- YJv HkLeM$n%1'MmUf F挓7~VO7v6PšZ_=~~s)iVW6#"kwwr[d/ uEgw.b$D6[-vLwyX,Ū2dz2N ɈٮN׆2XN8ג1wg*)# ҭUeVk4-v'q Up~fGݝ)ӌƥ^IG)و1R&8dP‡TլHqr0`.8v&dJ\S\&^x~6W1McXfy kq1tFht u*lSnp4>u ]"zy pu7z5JmbȾ205` ;G>Ha*vG-dGIOO?X:EyͿlǽAr \s܈S3`h`u9q7df]}'b`uM S lq][}jwkwyi]QCv{O$5PXۑ iQc6 `V<S9.V_뮰WK/TbN*䋖R'Eoc|Jw)7ʇ?~;ןZNf1:?凨MFf b/ՅScvՅz(T#j:,q<1S*;*N>ӡx']?PҖuݲZV h/BSokHK!Iġd}h4^c%RK9# =aȤ=?N;ՙg>} [Q(Ӈ=/h-:xO_2֗j`7[@;*mnó|^~.|1bj6֒}(Pe~+23[arO͵g}=oȟE #| >z׺ `BuBwXy_Z>[xܰu!9EC8%kwj!Z7SE9u Ձ uBcN걘2:u 1hݺ`wN֜2ݴuݶՙvyBcic1ͤ.1اKPE"1zҺHD:٤d J~#oD `y(C:yok;ݣ'HVK2Hİ|nyӈN}*)mvDA@~sDt?NDAKqpAxs^zp(QM  Ş|=a^[/S!v7ԫr?E$[f6mi]}~* FSw~O~:Ǽv]SeLe0"OE!/`Zj98_hlCCrpR ~H-iv\ Cs}B98fwA%̷c %>u Ձ uBcN t8֭̎ | ӣ8w[=Q [\vUnՉ[ )ښSH)YǑn[χ"]oSm7Z/6짷L"G1Ne{\TSD:ͭ&O1JbwF%-W%7?m`9\]Ww5P5%n2J4iE2lqJܚs%ٷJP䜨u-F捚R%QEq!Hs_)I3#[p,Wn BY+R|p_!GhԖMj@`58$їO>ovuҗO˗D@>z7Ed#"8D[y?=oȳ?R*??='B$SAfӽ/jRAA D,`$j(2pv!pzH)v<; (ݎQ<5, Pj''ſ Ԃ9O=t2Q肪U0Cg j@ #M^$ߖmߺcj:,;̚e?E6E^؋l#_^m*rڤ}Nlgԟ뫝U4kt^xf_c_?DmŦD)KNmnWsW \[ѻbQAW,;˳xw;Tiӳԇ]0wq*5dNrJR&g$V<#NpoζO[UãW5rȂr> ߶5Dn>]y :]jު Y߭ ϸ!2qt ]%3\1$Djݪdj>٭Os &iC*(RgqybHʩ0Y.2)~d6 lo~*!3Yʜp%#C_t3YW86 
--qY[(?3I.Ԥ-oaV8!E?EHZ-;hD` c^~`ItL W9L *&fB!4Hk'4NI)j9 bR]9t$N@bq!Tf$o @'R+- OE,R$E1~*mD(ڜD=  Cy8PI" maZd TjDф,X nPShQ;jQOy-icKh,v"B3 w1QBI5 r`(0DShu}`PSʨKvb'qV'sj 6~tB73sn~Wb𢼮dwF=FÜ&H-?<2H͢?5=B^)7X68YUlK} o>װ ZqeGmmOK\gON?:*7ȋbOA&zHC9Rj=QmHMDq~RIw zSʥ8Es F kؖ>Q,1 k} :hG¾ mNz=. ]\]rVGf^jA ݖOXG_bSD:Wͭ&O1nBշ6Z.L|5QU*ZY>y~].>\ ǫ7Z}]EE2lQ<]G)gy:V9"JoBk[Uj,jM臫oUv @!U%P.Hs_)I3e䥰_mFdf ህޙ^JP˘z!I / GhK(Khcv= eFR~TyEu0zBScJFӹ$8wbY0s]MQ$@CZ.vQL- t6Օ 4;~$V1?!w,n7nޕy%U>:#Jg2G㰜0y﮶b;gk}*fOK4;ky-N(ltM_SL#xgXlD8d' ȷy!8z QRfV OeE 7*4y,"| (%b@V"SI,< Rvd( I\ص1׆Qi@M-wvv[j&36)Rnj4k})KK}-9Kݶl;ějFR*XZIYʔKB5qԷ;Rqayk7rxmi\F9T9Z_J(ݖW#KAd26 tci)5#_Xz,eRfW˔qKK7K--UіRs.3yT72P+: v[j./3YT#L.,u-=,ԂNyr7R^=혥ݖ1*.,=k2Rfwj,eڍY*g|ʝ(= A\cq,hYKK}-5\Xz,ujە&`B KϛԸR(XJKK9חf)'n,IJTݖK}Y=7KVOf!d3i%3f)%nvSTGݖS~Kϛ3>p> RRj.eWYʨK-Ϟ@UQpay)&lgYj$N1Q+]X 7%lgY*X n7 v.p)eZ? !TLiHZDjPD FY;(.WvS!$?XP^/8H3V DsyAʟWP׆ƨA%#%%j +,ʏW3PLPZ1e<\3 d 'f M K) Ҧ5dpRVЎ7gӰ]3*0p0w9V\k'9g=t|8Z3m|qŒ홫|.V|VE-kZF=_G,}s/Qq >ϟR9EUX~3ߍ?@ ڨuapZ[:o3 R`XB+72]U-ҶB"ɾ^I{1j6o1mWve/.i-sz_v#V]θׄY.v[/K;5[C>R3q'̺^Eoq\Ǩ3E噠1_h_ t Wh\hj=8btQhQN] c"c Uw[(W=aR/>A޵#EЗ>80`d2ag`,iLp-nI-VKCjY+s.9,Lzf:o<='0|ZsZv7e}囏Θ3i؊C|BaoCb9yXr_Es]>!whs4*sv%#[eI鲲}YnpUoL°2BhΣUUsϲ"E)^f%7sNnTE4/XgG3wA4;'0Rob\P&6uux̫'Rrp0|TWUj Ť4\*9\I%"\;+8be1UŊ $„6Bm&+*LBb,AZv)vN8S]dF1bh.S&ng2swdrH <Uu_Rtn{+֔. 
wO0_p`y1n?NF&$RCM>MM{nX (r>B)AOB$ y6eNO.:xBPMQ:=UXK9L(aZr i0$~߁ CVzC WӶb:5 # 0kY.OT99mLJњuZ l._z=ͽC;{D8[9hpS~ &92~*?wYH$OTt/y3O;̿qok` \m.*0?\S{1kd$x2"s7dn)@3އwk %b0Yb cY-Ͻ/O82zi֏lp5o XI$f2XUC݃Ro("U%gs|(BJ"ZcsׁٕSX:y~0xkjsS);aT.kp*qyۼ܁%U$I*%RQ)A{f\~S.)0n"O_!a{:2/ D>(&y@v@Ro$I |S7Ly8]4sm:<9R̹B1KDurD?"; |*0D̛D4;M_t Q٤m:UbC;x;;;WziѤg|o񦔠R2gȯΏ`Upkwe/`3]Hɻ⻽)8UaUZc:5~^6TH* ~\%6ͽ&9T؛c$gF)Ri(ļ˴#s.4sMfWX[ltMH}Y+1+¸OwlnHO@]Z:WbwcW|S>tμ]u ""A$)[ EuA'@f>蒪]D'_U˕ !Zno&ӿ/z=3NC$-@ulS^5p m7V1Brc<"Bw7]jPJKspWAyE{7 lΠ̓N/UKL&=1 D }K}n+9ePT҃[n=,'`(a0(O[ʓїnW5JUUc΁a)9Q[%>`Ζwl4abWнod7˕mfD Th }QzVBASPIo|?h EKWȝOc\fn_G;~QU 1Ҵ~6Ky$\bn 0: V 1R1*GV{/cS%-ͶP0my[VvTb΃MYF T1p#*nLYܣh\fY `ʤȜX 0ࡊJ/D*:L[3sOa2Lϡ#Hh@+El ']YtĚL>gdfTgO߆A/ 8|Uߨһ%9Vз5s1rmoE:J IYY D8J/ ;DJdi#L`Hy/'ynJ,oB/:MD%(U3 ~n*1:vw[̨DaΥuȌRrI,Rx@s&aF)`#`{yJ +i R w#pe,gY7؈#وSM(hRōղ, d/RV+3kSU6ND伜CgHS-7`fa/C7_uqo;2w?$ߟ$ (p'|N@|ųɕ7l:H,ttyqLƗŝ3/ˎ~g:Ob?EIX } ׫Ld{PfHA^KTi%˞30BfIK&Q¿$f"A\9bd$t+)"\PPap``Y Y*1 s0qrLj0N3msi|ԆҨCE~ .4+^Tf2B:ωPL ChS8G:T{X7 V3&RL_JDFZ0r .܃yb`OeTX Fg +t#&=EpHgb[ˬ4t^sڃFc6B<9rXf \5*ZMVMD/R* ^Q1ҚGe 6&1b)Ô l}:1 cTd!B 5\c37"%mn,*î [}fZ !nCϒTo)ћAM7V #LpV.`oՠ`L?r&Նi$eL2Ӫ5QBJ9cIX'`K0+m)g2f6K :8w^fnUޕ21Gz9$@`tdgURx& W{1e_HEoa'Pok؁Kxc~nMBAcΆlR%k3>Uw7wkW0h&hs'X Y ~I{TT߻|5~rAu.{g ӽq- :xR5ld~Bc,ؚZ#hKB>,2P8[Ȩ/BᣯG@i #R5#%)քmc0%A&Vu!"{U( <{*(,ݥ2Xyp:<ĝЭ 9ϑyf2!%ڝQ̙?93>JH>?Φn<科<g#}9_sk.:|n{[(3zgӟ UhBioE飣X=}aeqcX?wt'pA]Mtpu32ѡu<.x/ uc j 9ndHdOdd۽G:)fSj̗ ՞\nz)2o+U'2GB=ܸڬ#sLy7gR_tlGwſZY< U͌_^7Eڅ>&qkM'AI>|s,:qPdV,~0}1ß Uwcο~{A 7xo~M ^KzeJǬB:NqCe R@ϕ H󓃹,BTsa[ RM([T=rI_:}t-jDW<. 
qy`&`?MVdm5nf=gœ-K|0?f$[ޱ3njQ3=gr]٤󣘽5l'Ω %)gZ2_)g-QRZRIW)Ҏ?}tqJr%3GݜT12vL{}4tZjij`rJ}m-I:Sw+ F8gA#N՚ͫnTsۘ*SE, ϥD mܴӞ9[Bޱl'~mY|47s;3 >~ LAW'+qg>!tyH {wսI /0C"l  lOBqt߷(ɱ,<:GGJ@89 bULg8^-G")Km%{]s뽰e'ĔΙ~Ll1]Set"PWFf4Gז1 M+c0;<Xbzҟ-,{M%sE%:m ]( [)+OR#`9 Zt]'惻le:%^`x=EJpA-5 >l}ۻ| Z s߅u>xv|Ƿ.-`B-{SQ;OEEr ~QZ7/?:/?@X;rq]؋ƌ̇hÞ\ e9 +sAjowSiUH0GeL-'`)q% K^K'TMʜƱ.K*G_,&n=Єs0IRfoɼTL>ze.A$4/HYDã5\o F=+^O cPrIFC4jQ1{*ʁd6yAD_QIM#fN%`w|Fmt136I%Ư%_Ovc!a!fGZ P[j5`^{d LAk-,L4!>: EP0Yno̹c%n/umVKZzu8г fQ+ m3t[iktB+-S46v )[s}9PGOⱂwJJ,9Q#R2znDhɓ)-Q+ qP3I'aT|Yܨ،֨♯g=&P^<A(dt*YC-aNׂl| uapKٸ>HFۺö jZg (l0'WcøCg@COVgmj ;?A%]g tTW@G/n)n߲4X?=9Oc8J^j'4o`x Z:s">%eR7Pefr``mnΏu=:?N1^W;'kWgywbuz?o_NyUZ׭N]A=8 f~1;>Wnb+r˓W^ºwyZ11M1h_;}|>?+ׇh#Iȟ\DT7TW'MN#Zcݎ}OɬF6EtLI)OC~jlgm(-3;X#F=FNC[' j&W4U']ic w+]i_^n^}FK| !i59SOp*r@ЯV;֟W۲oz|A\AxJ.ES|*0 jWoΥ - c",a 2KM)5C?M76iu Uޯpx=-XKE[FQ^6^ fgO 'ν ]wN|t(-ORL5X HU(QH#Rw|6"5u)uCAE`4MQ-BgWJȘVD"^nڗƫb՛F~ Q䕙Y8-% + `yCi$cV4;yge+d] 7E[9)^Aє_,9}x{ Fd)\a8p{n)Td',TA}66uk IzVg*ݘ=+9^ųVȳ;C>\jys'LvМ[&w| CnC߱CLw^Y}xͩM4oO$>{l'o 4%B!]i||w}ӓitGѯBMj&eN}˻)OH/s :)؜*t@, I(ADCD'K^B| \#n&Ǝ |Lc:Qzް{;rǵ^3CMLh=2Kv-QO"ܨlϸ8";2GSfHg#_nF؏chȘ$`t$Lʱct@=4Q]8< fMx@Sac3m&N`۰~k8^sHhc&~Ayi/ʰu7}_$ynh9vq;pWI +|8kRSS}Ӗ>+71 E997e{O, " ȅOS!tNeGN=n z2ŽcmYG'](< YceaVݞJ_΃ *+l=1奻gHubcf@|c^Cf8,;qg%%&կHb;Œl 2O9CJ>_\$eҊy2 4:Vs1XYv8|Z;Ւ?X%%ƅۻK&Ƙdwٹ-Hd`a6 ò6;8ebR+ v(4l8.KZӻkrE]^S٧g֟\m4st<+"/Ֆ,[mFCx;2؈jt LWx1>O.92#;Y >{5jnF].R'W:r!?j4 us(ucbEpt-YCME$.{]*QXl5ɚ,g7v4SepMC!pzuJ0g+R@1@]W 0uўm76%*~%!(>BwQLn:m+p3H*`b=ɪd bDq*8[`;RZ\ᐺ]4Gi.'NΎ @nOC$ xY߱# '2.>fq2AG&ߕkWULʬeKL= }r(e`qiE2n8c}@\DfLJO&">#CFG\wFJ Bp~ob&jGrojrH,ш$hKiKj߃D밓zb$!]#n58%p13k X1 , p`4D!TIC|&s6{͛OU' @!;mg1eڤ-,HN>uFJ Xh#;oo.PXb,֐|w2^嗸[]pgd5u%\{1>\Tl!l(zg ru$4".S2*teCbMZUq`yyw`$V>W3\C3Ыnٶ1M\IpKL J&M< :e]Yab9pc$ئ>.nNZW'Wy[c=?KND5fR#G7j$d_nQ\shA-`ۻrs $anPTeK" d5d4>&e |qF`|#IBO42L) mHJ=Շu$ԡϡD;xUUץ`7vVL.1-UX\pV< dB &h %`WѪhUƻ4V3f:Tȑ0KJH9P5u٧Wzjt N_N21hAHC+֩PX= ?|-`[n7b&;zg(QZR<1pb@4s==1G׺~;z/fJS\4F40:P2<>1jh 0αpmO&%_H(qN "974<fqt/P%wiVXl^W{=)!-H,"cj @ч0 uE%aC 
Rq"Y,F9l9Oly} Z.}Z!B.9̲h 1sɥGH ut2 .P;a5sp;w/St^5m"l C QT"`cA7X=`RȃOhtB>k2C:#] 4IkAQowI3 `Gۍ[fHF( +1D5JeDR<-dUQe=1F_Pi}rj]rA m *=[A 90l (uWY0u,iTv_wƜ$zURFz-ΰBВ;vgɐfDE)xD9ZDY4@`q*'7gа̝&4*S0TJ 2F0PjwG wp͒_w/{5C9w Z79XA.y\)ZxB46$sXit8|u,w9:ZՑN7NuHDS Fo(Pu4"|u`@$Ec@ٮiJfF,]Ά͜r8Q$'P@&Yg; ܖ2V`@4dݻ9jY)PF{4 u@ʣPdo Q֍|7lܷmx>t]Oe%)ÌcRHc$(rSod"ؘq80H3&żM9n5S I*i"qNF PEr(!~ݵ٩N'>ˡٔ$ ܰΉJ[юR:I6ؔЗE-y?wɓ}ZTwg M[^ MRD2UoV>7p٭?> 2J@e7vpMG"2%e/:?kqώ @Ѩ0bQ)el'?6hJ~<]AxTk-f*#ŲI.s3k޲ TrWt@" Yhdou^DZ&?~e cVY(͟b$fT`AC1k>gR 2A|%ҐāawnZnݾbrS{} J~or]C5cLrAkd\ZpZ UۖvEҵBEtQq~Z~xaJGj7si"SZ([!WtrK55A`$16|HqOꨗ>/b:Hv=ǔs5~|-8'GG}&ɨ4J?:AՌ& Hҫ"}Tr713B+ԹoN~$ż,Umtp^#M('u_o#T(ot34RX(A[.Qo拻C->34> jwqK_M&8V~<\Jl<ݧW%pGg4dm&oXR>u\d0ڬxu#7RbbRS9+? x6r .EQU>8Idj6=߁ Bw2u&|[2H&Vy2(aPZcj2XFot]4l_r%:[#CL%R%7 DDGU($-]|AE$>P9bDh>ݖV^"m mk3,#eB'J%ܨJ w_`Wrn!X, DsI3@ULҮ+)m/mOc%,V qqL{[.Xc$V@G%9H%j~|Ɏ.sWfwmty? Jxq!hYo1y{m\Ghv1GIe,w//c=t,J`EqZ Aci2YeU{SIV"1#rbCQ`K2NKx>KO58UWB~k;Q>̲kK= [} ɛljN5n) Z%ltkȧY.xKt$J>F]WzIl T~ IbtUNV2z͙_b:r?@gi;aAy D `!> |(3 ] .Hd XI%IX61)CWp ]þ̗1\zO =,HNH9.2R4es=>7DT˅їڿ[9^c$2>#lTmCuT.SHWq3]ϼn٫p[7F댢 Zę  Fz*jҒk?ꐤv{3Z\t?3⳺K;//h^. /?:Dx5>[|[<揈.]) ~넮 Ey.etv3I'K%JChR| ^Gj\Ҥ]fU>KPwYM}r$c7k= UK$lC$6jo}t#33!KRTf>S#DJrHo:MaɞqKنvzIf_oxގNGQÍ-K#aȳ{v= OHQkH'uat7rq맱+t⼪˰oBGݕV~fRLB+~v^+ ve̪^R1.6V >-Vph'No1kdS,X"eM&.& *ӳ?wG!u3ϣVF&ȋ8J'γD#g3c$^.!u2"N1 Op ewnnv/5m zu~_xq"[!?xgg{Q_w{+G/;\lh6 Ŭ4h{XwqIPWZhQBX|lM?^=>dS}73/^EWa4_Yxj;X cYD=*X\- M\2!ܹqU>FZJRz5V.3:)I:x_3@!h4h":mcgno$7n]H_\D[w?nZҠal;n/X-j w@B"2dQʐ7\s(w#j{ HJN?{ڜ֑ ŧd7ss>dmUj&/ZH0 Y`  Hp8nb 'D<r˩y(ԝmM*DzƠQk8V?] 
[n{<4&qyZ}e7`-{c2tp@AAlP@QeXO4긽iieQZ W_>.>֠@IFdYDgזsY'MZerzuWcjR"ҾX#F%gh𒥸\!vaڰ7YAzYraUY$aEdDWM ۞e*qg%/ߍ6eHx|N_QOQ0®47r-9k,(yiM@)s +Op5jgT0@٪[sUɟ'f1h|@7x#%6J%2 ѫU)+1 SwsO)m59㗮N%UPK del9֐0ё x73> :A㏺q N,qM%aLp⓳D]N]nsQؽkVUT \ϺyRN(Dɧ{ʿ!GGJMի֯T_g鰷)%vr:}~utĸPB{2GdĞ뷦azxE RQSPW0?V9v/Y;*x ջ%Wᬎ:mWJ/||.Np 5]s\4:$L :u7osUU[;+{߮w+mi<sm%Wtrn :2/bC.D'&v3ߜUù:R#5_qosWv ߠ$T UL%14N ΖS(WFhyJ:SAqYk OZҾ{ͬTJT3L>O[;%idbzj[ݔ8#QBz LҧBaD` +񘜛 9>8%9&/52,(xF||d]\Br$5N_`i+|겖3G*9Pa|eJq.U6~osKʙPY*QԲTEUݓe7D;=(QqR>ϊ|dCFo(ea}Xbej_qh3 O6Y.cI2:S]QbhHo"3ayr(Y4yPbwrRJL1k\_i @)j&"1ҰEtʳte۷> tt]Yl3AbY<^!i+X@F3fW bD$*GJM{fb Ci;RK ȝ{3amFre_ ZزKR|¸kh'.e) 9V1 #en^Fi̾m'Xݕ+ i]:wW@,hRJk4/gbdmy)ڵkE 3 @tBjR)B+> $;dɑ>E;8F4.K!F fF= ؒ.7BuD@sA|}# DIAtatGeY-AD O҂F9ZC- :VqFZoVyv$sLFf$z((Q>Kxnl'[i9<7 HaqEgcp1rHTdĺ$ڷR4 ؚ`wiX6vL@-{!ťUYR"QuqkMJje4*916Rc0)Ur"mތPj^ά(dmv[2cvW5( jXgB1cUs^`9VїtX7;LI#nB}5WrƸFՈEG}2?j Pg#sfViɻJ.`{mͲMn|'t٨OG-=X~[i.$ʗݰRfV:u=^ZlK[K + yi5jfk =VVj_FM1u幦Kd~\ڃ/o+墙rQzN+]6Q9X^Zlhڢ~fږ?i`+3$XTE-c`bh9+TQo.޷ZˤN^9$&`0G)u4H:k%=7f9ue0 v#hQ U{YTHV V){lM:JNLz3߼i&qHm7U#_ ~/9műP i)5U%J|:4Gp230wNFfb3M|my%nFE8%,,٣9sOшL]co_4mɤapgX%T!zJgQ: V5r{=gUaZp!aVRׅIq1 C1[7~x}ۈܠtdAfZ[5HZAz|;0kjG):ez+yPG}W,`-//I#,#4qɰ1jSVb*p o"˻n9V+p8N(_1^f6ppreQؿӸ?tZT=_^MJ}W޷'IHx7}L]3ۜtzãG*H-b/DY R'V:2wIM&R>< 6in/@sk*dB) "[Sl|wI͘&Y 䟡#ՖttpS_UnOmP?wnݐz%j¸dnԚ\R<̪mm?C6LJCJYGPAm؉|jNeuY-`:Fto@yC`8HgԴu?/?U&1~?f,lxP4P\7'[PùZ'[ƠtȴvtT;P@!CBJ&)'Cv:il`a`#.Gݲ??[_=Ή/gKFoo\̅ 'ǩzEau6\Iֻ?^#'5v# ڇ['O Un$Rb4sUPZCݐ[l|8v/7@,YKb R\4݁D{PXZ18"QrLlh6ӂP(DPV=VEaKr.q"A2 XAÄI'#%I F% uzdc4 0C=y:>f'; OtaZ*6k7-AAڱ1Vow14e߬_vҭA/tN F+u 1 q!du&IX&\JIQ<P;YS!Tb|rce I `%6re>9kagk TG, OLleiOlp?L `Ĺ u&vtV2+cpRJZF#3KWrDcwh a6zs'縩WiYVl٫U~xW#9- ~]'kVO0fYgcJ,U-ןt6~lPAh $W,PjiD퍚kNy4+xOhC<{zm?OO_0`/u/xM,J=6Flܖ]HI"zoxǁI^X&Ոqޞ`D,ΝΟ~ɝw/ӢKss0ʝv(gB*]P{ەV7Y^B|'_qR}t?ojB>K\m߿~9"}(zGYLg`aBƂ&6dgϓ&a"Nz͊5Poqbt.~H\D{N^nvK.,pCn3m L)jڹ2ڹ"o}*췡Id/S R⪗9fuan*<4)³Z*H9K)/2R?W[SyP+.[J9I)o?r'!ˤZ L:o)-fBXFiӜE*1_g V6ӀDvEb.V.Ҳ xs#c R.QIj/Nfl+$ 9dc;W=,5L1ΈL/F±1p!~+q,w??]H{qd_J:2 CՉ7G?H⾧ֻc2``bZd^qVy;\o_*z# FO35YvfkҒQmcec  
%Նw^E[ֽdfF3"w9CZj4_a8b_S/#/hfׅx`߱fRP#4h:=ϢaT{QQI8B+@M,2Zu""'I08^.uIgIхRGSMluHI@,D]1LEI% ˢI bQzew"WGaypkȞ$ Y IsCL;3% ϙEd]7P41 SDCQF-O(At]9׮jHbYaHKIrC'3Uc0CbbuTVCC i(]!&|?c}Xiސlh8x[;})I*€Cg+kVG%6m"[._.pe$AӭXƏI~mc,^6vndOܸH%k.4^11j~ XO 7fR!TbA]KxZqkTG}YCm5L 76ZS=$!#gG Šْ݉{,Iwh GkPRJbLRhic@ T)!TL-V.d3hنRX|CO8`y!(Nc YQِd\hU,1Y(@hN"KXԊ Rs _%EQ(9xE&KsUlVL,O$%^zB(rE]" gfR J40>6 H,NqZÞ.TH?1#9]',xl)v5oV᠃DUݚV?enTwA о@/_z/=y{]P+{uKpBr8@GUҥ~7ԵZn6S<ez0҉B35}6AĝhnjI b9E;9+yMcbo'gjʠl$C.P~es=@?,% zɼ[z+PVn aZ37~ҡnt&/YK AJW"U6j|`6c50O9W@Z:TďMs l:f֒*Vi~K.}7I.pa  G| =L=KU**cwfx+X۠x X)x2PQJiq҈j;^4ly%/YAߣTeF4cYK<~{β+ZĢϞ9S+ly3DI{Jt_䡆w_NrqэJ%\vC˂d>r -k~hc!8{q~TGXU烔HRLYKH,&@ .?s? g8;q|6@^SpdLn&CeǺV0zK%(仝QF_MD ]u:˶Ku?S2n}mt#]"hYY T9Bʚ[כ\.l'(5٢Q:Kp_$&toLtZJ`* TQĈHϯV(oʟ.CpUqޞ2 ,Suz{f~oA VU)+TUb{Fh0. gOT&#T%e=BmSm6j fe;`L;6vـv6{ \*aдMbodѭǗ8eSNJ@^"lʡQ?d1y؍Չ@TW tF,dDںʴ'Ιt:8Gndi`}obTNXYQpP)RK*T&r-A"$g82hYBUҩl8**l)Q}郚@\?i_IZ+ZOLweIz&X `{7|=yA)"eVYz_S !~@0H#'\Qޒh}$Fv.|DN6FmKxău`,u΄b ue9G5ضTlqJRGQS0I8,0.Jcv)X`78~ X:08Xƪhx`Ϛ{NNSq_seݝIC (Wm:L`TJt`qnI!@o<>Z?iWHr)^f $v'-k]}? ,c!D0f"9w'ۇXF1#s]~S{D##BڄYz''ޗBb]Z@t`ik/$߲O(h ! & &|6+1 {[y+\I#ԅB6ἷSLSM0! 
@}&T {Mz 8:`U/2)R eܢی ӊ¦8ԯ@Q#]@s lsdu2M%{#ȩO 7^M5&aQykre!*%Xӥ%w_Ldh6^0!B"B$J/ 9Bs7c#oqE7^|Vtv85#$1s9 '?gY|Vu¢&80#3y\R؉1xr°"@3-ZR)>uZMA#EfG`R6PCeXe*VѝcP` "묍@.mwŧꓛJE$asHK̘JӒ8y$ )!A(",V41g(!.AIJPKzO/ yBGۯc\_^4ۋX/ܟ@G鬪;O:>dU/ aLGC]Q)l#'?]O#6"VjVիwM<cחyov'zv[ La}=ʯ8x!0b~V_#R=ndVaP]|+X8 !G~əI`>0VaHP+/|vdvGK -jp} Xq0ɕ|Z-gC ~'}@ڂ)ހq+Urfrỏ}<.zL!Qp"תyQ¥=|eҤ_Ns{!< ([4%=f6w vNBFhBz`ddS6i'C:!c0cwYrWv&P'=\ ws%qr}i}_@ c9Cv_\GiA4L<Ǔr0:<#5(ҹh@;@؍L;ѽ{00oBԴdI&XzNL$[n2Ì1Lp.Rb(N/.1G;:7oH(8I%E(rIH4p_)dt觛) \(2_W" 0",`w7MY&KY4}?.=,&EiG- |rȇi;F)aJeh^t]FTGB"a 1082$% [XQ҄QKgW5zX_4 ?SZ)srAzTrhibCE4%&4 rDgqr~.av/l/X6oҔX\ۤIؕt}UOSx$&_$D8rZ4I4R mK})\ƌ \#, |vZLqIs؂6=<ȣҘ@Cix<71Ɔv6+$*A1CuB0lB`'S=Ȯpij@Vްh`3,X\?U@kXH)L1K{@wX:0F&"tPp v?![[@&0#dȖ.)="aUnIp`Y]` AR֖Uq{Ĥr!/_]cя"xs1v?U:_m NaD8hkPA8BrMi^;9R3KSbs>NΫo~_MVXalL8fWQKU&  甕rLFC!Jqm(D) ((kT|!T JJ  7Z?j0&bNipd-htmAOir`/@ )ⅅ❝zR Ot^9T|"gͥ%JZ1Sǣ%Ba&I"^} 4} iyM@Ʌ8W6ۄsq6yX餸(7//o?8T1`fkwW.Wc4F{)o?n`i؛ aJ~ UUo -^ ·ɭxӣ˿0X)F_涪w a\I ][j  @wHݡs4m0Fw~8*fl0RC)P O87`iٴ~O*߶*e+ft]/qtokϬV|4+Pq=8j{j/&ޥR7(_7 /wt,HB0ꎽK/m#ӝ4FlN9n5\vݺiiIL=Hf:,M4 ɍh\= vr~qKz!a]?Faҝ2R6k/e<{vW|}5[l~6#,jk% x*!G9wOGT)Q)okd%ͺ'ӔV3d,YE4zce)%FFy)(QL 4{Z^=Cfԫ^WZތ-Z]^n}\e $re_y>]@h] '8F2u4d8%c,r)Dkwum:>rAJTs 7{}6NS'4oKqRgfR'm[;H"5J?䰛O 9ы** e-yᗊ4PM!6;иl|W-GHSՎsQ4JMz?jԳ13/91]HflQSEVFTJ9T J[ch4kyG`?{Q@Z8ldո{Qm4O.ـJ4dm!ZkF[yb 2kMChD2g<'mE sFotu}K<]Yv u2HRG[-j/jxIvxVǵ*J<ʒ@[c1&gVnOfweя%9ˤ[{^:[˔NdI ,(Lq2Պ.v]j%8T9^\tBL nx\AQR9 mgzV_24Sk FDsi&  0Hjn3Aɘhȥ;J߫Ǧ[6HLbj=vNO.O]iTA 6V3Jx$iz<9P߱:XT(ˆc[ R:-`10dfl*y_'i:'H8%>s>܉z&Z14roMgN}$^v(Md`C^ oWlde >O<[nh!lF5''±# ;'luvzb;`1Qk*E+D C|P3,d?Vn!{DN|[-cg94Vl ͽExDFo8(;O{><(u,@zi ,/ktCYJ֋㷋d+}:t '=x,ƱSVrr|bL@5|RMO"+/T(H0N9<1@=6 ُ'u=cVUίdzds>Bbqy>b5^,eq`B+)X O6V|NKӤ͞|K1g;XA(>\Pxc'VmH:UA˕#꼨I|9Jf U";ܣ3Sxl1O8IEXh.Sp(Aꯛ@5ZVXLXmka9 8_~ؐFau2H EF) vCO03zA(\|>=dda DAIE =fm3;m3ۯʁ.˫m%-BYֺX|MP OV'k=Oxyao]iK!3&b#U4'ÞlMv|!zs>ճO}FW T:Ve)4'Fz,vpMǽUӜU5FFYcv"DYeJKΑn5 VK2_@[T O-|<#AifX.&/> /D=Cyg/]Mvχ٨|ulykg|;*[pv|r-ξ'Rvc_ 5=N,EڰIkt&nS:7`Ϛ|C]Wke$=Zeў2^a?o/ٶ 
^.o+k!u{~g?*7Z2}oCib~kzNOz]b*7؇f#d[\⪼1FjAD݃=0B̽'j0Z5o\7h2A:,{sa!WC~vw? z >'˦7 c؏CBNb?~f_gԖzul\gdS[rS%綫zO^Ch2>2~> ]~Um2HpgC147WA:Eg` 堏 }ۯLYᥝzZ147WA:E=uAX_416dO6*lb;;2S}.lZ+8~NY + K5Z]d;7{!dÝnx<_˲EH1 A"rvrzۯz-3l.><c~YZMJ]]ƃ_GGJ9&|S+>bO,˪YkgUijlaVywlYJ.ܧHTպ(E  r Lm!t띟8-=h- 6*;Lt״f;ld@Hn0t1ϻ[&O*Dm r$KhA$-Z*GOx!܋@NOG{2zHԎBűNs\PJ`! vEklk(rtֿnk=Ӛ~*ݬm X-B9ϧB4.IG£9Oհ5A1?߆00X2O 82 S=N.1K.{kR[>"8֔h~rQ[4fZ_K<>;ˑczqU9VnPxKAxY<(4RѶ[mG=Q_xn#ۓdk6Zݷ~00c*ǝoHZ0tc+٢X/9sm[+@ %@mnzBO @J,0vޑ7J{sŦ<ш .bf`봚X[X3))<']ƚ 9Ga$Xe(`;mt͎"Dq ! J/{tFʀBsP,My^~$Ubv! )yIҖsl=߬?F?Iod/To2o,ܳ|~F2 h$'n=2蚾ݙzOOwplrR!ĪKRURթQ)%~(ǽ-ww_秧-EYÛrCd$M]@A+#Y؞3'lOGþp4f3TA{/&Jzn0q&Fx@)-`˲%hDg5Il84'C0T#ra[uV32!HA.=vĚ*QIrHn&8e˙Fk.lcd'՚:]YhD#dk0,)) G;H 1>ֱP#rfuh%N.N:վb5xײ<\/QHK\=qYMBP7Foohqȥ~sC4ڼ%Ğp4jAD y{J8TOQRGF ՁL[bf>ó f@Xip am)C+EX9>{IjjHxӊ?l;vIiBλwVEϕmpO.>}Jl]!D([# !T`/R0y ZU*KukQڷ]ߠ.hh4RdHmˍD]mkRo?M+TL]<Ю'xw}@jrPHy:'`+j]讗ųوNR>׮J*XjJיl9iӧtihe,$W8 W`@v{A?D̈ZkWNOEFKIIr#94 G#f[r^ʅ%ՄG;1sn_Ol좯A`ءc7\S%QWJPޗ;5txo+FkeIio/hAl&D0a #K) LR-DqjH_'Hu9пt %IssVL5`-}C뽓u?^rweN?Wo(WAPR ili M(F6ڔf82VD Iܢ\H{۶_i[)~zcEM4Ar~$7\R)l5@<9+#@?B!vx #'Kϴ'n B`1ʱbђ?-fGKG;,\h cp˸cBY^Qf#;L8dlbi1OP ~S08uMܚ/GcU[F kMc3Qpn!uUc_%:{*!>jd= ]_)CG.5eJJJxKx2@r$!Ovq}-ɕ\"c8+m96}1v JWDqi % n7V 嬂3O>! 
S۟'Ӝ4GQ[iT}pCm2ohUJgE݊?6yp~YͫTUC`CIF rcqw0I2ёk|.*fPxYƽkTYI ڞ?&{s662:Me{L U%SeygLBR9)'MQȋ?*kX/ u}OkN 㖹p+y۔?Yp&Q pr퉂Fфh'tw,>ۘ$'8 cEܤ1\F.33+Đ)*%$jpME.]cfrSNp DTR""&% %jTP*RX#; _7% bcx)^q"2nn_2qFnACݴ owgQ!a!$8Qۣ 2s:Z,e{^q.Q@0yg~w9̪80,^#0^lMz&H==a1cIw#ӟz%lM?p1t)=|\`/jc.){Ov= ww2눧"ZBRWt{s9>!$.m!VIu01rs Bߠ/V'ZOk~IΞwяF..=}Q̯<- =4P;9Wt\cG.G l@>WR_;yY~s.~GDXGuv+kdl^ֳ9r3L<.?D@\]L)Az6JEӉ_x0"p}|R]0*_@!ctjPs*fPbZ㒃9jeqF\-͡&TkKyqL&Wp(*\z%%7͡VHp>7-C 4E]^_>,Ff:*lv>{6Ows節vwu|EN TKƘWsL˳rNj3p 5rB.Z,SYWٕq[?3Ú_l}5]MMmXp <.?ey% y"$SzvzKfJ ]vRδ[xvkCB^ɔ`8,ņBi#:kn}F)ݙv ڐW.eJk*V U~pFNzGw/uȥjE;7e>oxZ{y٧= '%y*jFݵ"#@&J](}T?5{w0R<.xrݔ|wut] zwQg_7o0ZɫlYt=Z&'u?ʹMNg;K6BjKr8$><&1NAR;fP$!AH*L!Y K*䈵(vdQD)ǜ"ȧƑX$Ԑ8Ed9K,0acC_)TED1%L9&JY h畷$̈́b9 )[|M?75_ ^{.L?S`P38M:&R2ﯶx77*i螚g qt2gY-" Rx2g3}.&hMh6]R8k&<v\XHAHÑ1raE%2Q&Q:SIDr2z! pixJX0OhJZoߖarܮ0Ktc7<j'0GRJ>vw.yGR}]k~;E.$wW;\bkWU~nUuo|F+8*(B^jܩfSe[8_UKe {eB jV5G/eJB;_ihwړG;7&0r#%B,;5%&M<哌*6|8-C !>5T4gO 5hFԷ y""S -&v4҇v 脮QG~OL=[E$SSDk8o^/WuQUϱCk fX|:0s.n+u;- V9 Q.+Bhΐfﺁ|o4{ &;;F%Βk3)!HTEj.lm^R }Y#Za;ᦅR2K F[Doͥ@6ANaoKͭeL/Slen/+jX,)J,x뺒d,Ո z0*7[G0P؋9;{3?'V7>150C!7j>WWCzšRq9'[]'ZSVXNNMR׀W1 Be*0"$ 2H$1RI#ƱeOӄEjB5%Zizw\xוYy/݋vK4TM2Y24b$iRKbFD4eRI4J8f !g6[2H>џ9,~ipss7e?~W{'Jf/q*'HR0)K%,ƱbwM(IM΋:]YB'Jm܏X#[!GRۺy6QZqM_nLV6Ip [! 3R*Ah 0`*186`0J"\c@F ⲫ{[=Qb͊59xePm 7jbozqFyFjB1w*✕bD pw5hUc|d}l^*P4{fP9|vG~=Ksd~iM,ZPS`+Y \ †Py#+>j5*P@ߏήӆ 301k<-Q;(X86z y1 7~iWp0ZrXG_g[tG)<˶e5}.!whi}C~ ZZ쑲oIv_=+,.z{OKt;5wI XB [nor^|668!T5cC$E4kdh'R$ 6#m a4Y/d+p^ &vc`1E'} [xJ  4b zD Qc!HƂ1KuBTAcP3ģ 1RP{&IX 54Vg:~Fb cl3\jxנB€ )'i["Cs Øf6wfOx vn{IBv)A:6Jg/5ݥ_X8uGMBVwʁ#T]V]~];l PdNס͡B"j xL8* CjIEPk) C݁2Ԋ2V43FV=jU 7lۆr ).-v;^]ר? x*n|} !\DCd mgIav 脮QGEO)iڭ y"Z)Jc*qc 1yÎ1Tz"/',- bʬ̚{>re!L􈥔!y8sRe$G-XI)Nj!f('0O) 5)-I)NJz>]R=r)垺纔AH)ԥZhc;szZkqNJ 켆var/home/core/zuul-output/logs/kubelet.log0000644000000000000000005541005515147020432017700 0ustar rootrootFeb 23 06:40:46 crc systemd[1]: Starting Kubernetes Kubelet... 
Feb 23 06:40:46 crc restorecon[4483]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:40:46 crc restorecon[4483]:
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c476,c820 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:46 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 
06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 
crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 
06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 
crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc
restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc 
restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:40:47 crc restorecon[4483]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:40:47 crc restorecon[4483]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Feb 23 06:40:47 crc kubenswrapper[4626]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 06:40:47 crc kubenswrapper[4626]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 23 06:40:47 crc kubenswrapper[4626]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 06:40:47 crc kubenswrapper[4626]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 06:40:47 crc kubenswrapper[4626]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 23 06:40:47 crc kubenswrapper[4626]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.848988 4626 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852438 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852456 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852461 4626 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852466 4626 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852471 4626 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852482 4626 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852489 4626 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852508 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852514 4626 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852519 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852523 4626 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852527 4626 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852530 4626 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852534 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852538 4626 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852541 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852545 4626 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852548 4626 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852552 4626 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852555 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852559 4626 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852562 4626 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852565 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852568 4626 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852572 4626 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852575 4626 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852579 4626 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852583 4626 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852586 4626 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852589 4626 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852593 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852598 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852603 4626 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852610 4626 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852615 4626 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852620 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852625 4626 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852630 4626 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852636 4626 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852640 4626 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852646 4626 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852651 4626 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852655 4626 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852660 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852663 4626 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852667 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852671 4626 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852676 4626 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852679 4626 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852683 4626 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852687 4626 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852691 4626 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852695 4626 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852699 4626 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852702 4626 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852705 4626 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852709 4626 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852713 4626 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852717 4626 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852720 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852723 4626 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852726 4626 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852730 4626 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852733 4626 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852736 4626 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852739 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852752 4626 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852756 4626 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852760 4626 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852763 4626 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.852766 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852844 4626 flags.go:64] FLAG: --address="0.0.0.0"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852852 4626 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852859 4626 flags.go:64] FLAG: --anonymous-auth="true"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852865 4626 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852870 4626 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852873 4626 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852878 4626 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852883 4626 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852887 4626 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852891 4626 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852904 4626 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852908 4626 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852911 4626 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852915 4626 flags.go:64] FLAG: --cgroup-root=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852919 4626 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852923 4626 flags.go:64] FLAG: --client-ca-file=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852927 4626 flags.go:64] FLAG: --cloud-config=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852930 4626 flags.go:64] FLAG: --cloud-provider=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852935 4626 flags.go:64] FLAG: --cluster-dns="[]"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852940 4626 flags.go:64] FLAG: --cluster-domain=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852944 4626 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852947 4626 flags.go:64] FLAG: --config-dir=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852951
4626 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852955 4626 flags.go:64] FLAG: --container-log-max-files="5" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852962 4626 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852966 4626 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852970 4626 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852974 4626 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852978 4626 flags.go:64] FLAG: --contention-profiling="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852982 4626 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852986 4626 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852990 4626 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.852995 4626 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853000 4626 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853004 4626 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853008 4626 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853012 4626 flags.go:64] FLAG: --enable-load-reader="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853016 4626 flags.go:64] FLAG: --enable-server="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853020 4626 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853025 4626 flags.go:64] FLAG: --event-burst="100" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853029 4626 flags.go:64] FLAG: --event-qps="50" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853033 4626 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853037 4626 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853041 4626 flags.go:64] FLAG: --eviction-hard="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853046 4626 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853050 4626 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853055 4626 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853061 4626 flags.go:64] FLAG: --eviction-soft="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853065 4626 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853070 4626 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853079 4626 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853084 4626 flags.go:64] FLAG: --experimental-mounter-path="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853089 4626 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853093 4626 flags.go:64] FLAG: --fail-swap-on="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853098 4626 flags.go:64] FLAG: --feature-gates="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853103 4626 flags.go:64] FLAG: 
--file-check-frequency="20s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853107 4626 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853111 4626 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853115 4626 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853119 4626 flags.go:64] FLAG: --healthz-port="10248" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853123 4626 flags.go:64] FLAG: --help="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853126 4626 flags.go:64] FLAG: --hostname-override="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853130 4626 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853134 4626 flags.go:64] FLAG: --http-check-frequency="20s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853138 4626 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853142 4626 flags.go:64] FLAG: --image-credential-provider-config="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853146 4626 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853150 4626 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853154 4626 flags.go:64] FLAG: --image-service-endpoint="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853159 4626 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853162 4626 flags.go:64] FLAG: --kube-api-burst="100" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853166 4626 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853170 
4626 flags.go:64] FLAG: --kube-api-qps="50" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853173 4626 flags.go:64] FLAG: --kube-reserved="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853178 4626 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853182 4626 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853186 4626 flags.go:64] FLAG: --kubelet-cgroups="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853189 4626 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853192 4626 flags.go:64] FLAG: --lock-file="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853196 4626 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853200 4626 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853204 4626 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853209 4626 flags.go:64] FLAG: --log-json-split-stream="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853213 4626 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853217 4626 flags.go:64] FLAG: --log-text-split-stream="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853220 4626 flags.go:64] FLAG: --logging-format="text" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853224 4626 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853228 4626 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853231 4626 flags.go:64] FLAG: --manifest-url="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853235 4626 
flags.go:64] FLAG: --manifest-url-header="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853240 4626 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853243 4626 flags.go:64] FLAG: --max-open-files="1000000" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853248 4626 flags.go:64] FLAG: --max-pods="110" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853251 4626 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853256 4626 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853259 4626 flags.go:64] FLAG: --memory-manager-policy="None" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853263 4626 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853266 4626 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853270 4626 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853274 4626 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853283 4626 flags.go:64] FLAG: --node-status-max-images="50" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853287 4626 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853292 4626 flags.go:64] FLAG: --oom-score-adj="-999" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853296 4626 flags.go:64] FLAG: --pod-cidr="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853300 4626 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853308 4626 flags.go:64] FLAG: --pod-manifest-path="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853311 4626 flags.go:64] FLAG: --pod-max-pids="-1" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853315 4626 flags.go:64] FLAG: --pods-per-core="0" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853319 4626 flags.go:64] FLAG: --port="10250" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853322 4626 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853326 4626 flags.go:64] FLAG: --provider-id="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853330 4626 flags.go:64] FLAG: --qos-reserved="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853333 4626 flags.go:64] FLAG: --read-only-port="10255" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853337 4626 flags.go:64] FLAG: --register-node="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853341 4626 flags.go:64] FLAG: --register-schedulable="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853345 4626 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853352 4626 flags.go:64] FLAG: --registry-burst="10" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853355 4626 flags.go:64] FLAG: --registry-qps="5" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853360 4626 flags.go:64] FLAG: --reserved-cpus="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853363 4626 flags.go:64] FLAG: --reserved-memory="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853369 4626 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 
06:40:47.853372 4626 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853376 4626 flags.go:64] FLAG: --rotate-certificates="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853380 4626 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853383 4626 flags.go:64] FLAG: --runonce="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853387 4626 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853391 4626 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853395 4626 flags.go:64] FLAG: --seccomp-default="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853399 4626 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853402 4626 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853406 4626 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853410 4626 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853415 4626 flags.go:64] FLAG: --storage-driver-password="root" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853418 4626 flags.go:64] FLAG: --storage-driver-secure="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853422 4626 flags.go:64] FLAG: --storage-driver-table="stats" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853426 4626 flags.go:64] FLAG: --storage-driver-user="root" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853429 4626 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853434 4626 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 23 
06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853438 4626 flags.go:64] FLAG: --system-cgroups="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853442 4626 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853449 4626 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853453 4626 flags.go:64] FLAG: --tls-cert-file="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853456 4626 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853461 4626 flags.go:64] FLAG: --tls-min-version="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853465 4626 flags.go:64] FLAG: --tls-private-key-file="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853470 4626 flags.go:64] FLAG: --topology-manager-policy="none" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853475 4626 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853480 4626 flags.go:64] FLAG: --topology-manager-scope="container" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853485 4626 flags.go:64] FLAG: --v="2" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853491 4626 flags.go:64] FLAG: --version="false" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853512 4626 flags.go:64] FLAG: --vmodule="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853518 4626 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853523 4626 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853624 4626 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853629 4626 feature_gate.go:330] unrecognized feature gate: 
PrivateHostedZoneAWS Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853633 4626 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853636 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853639 4626 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853643 4626 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853646 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853649 4626 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853652 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853656 4626 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853659 4626 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853662 4626 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853667 4626 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853671 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853674 4626 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853677 4626 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 06:40:47 crc 
kubenswrapper[4626]: W0223 06:40:47.853681 4626 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853684 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853687 4626 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853690 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853694 4626 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853698 4626 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853703 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853706 4626 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853710 4626 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853713 4626 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853717 4626 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853720 4626 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853723 4626 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853726 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 06:40:47 crc 
kubenswrapper[4626]: W0223 06:40:47.853730 4626 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853733 4626 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853737 4626 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853748 4626 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853752 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853756 4626 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853760 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853764 4626 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853767 4626 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853770 4626 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853774 4626 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853777 4626 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853780 4626 feature_gate.go:330] unrecognized feature gate: Example Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853783 4626 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 
06:40:47.853787 4626 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853791 4626 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853794 4626 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853798 4626 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853801 4626 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853804 4626 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853807 4626 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853811 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853814 4626 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853817 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853821 4626 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853825 4626 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853828 4626 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853838 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853841 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853845 4626 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853848 4626 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853852 4626 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853856 4626 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853860 4626 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853864 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853867 4626 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853871 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853874 4626 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853877 4626 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853881 4626 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.853884 4626 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.853898 4626 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.861034 4626 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.861067 4626 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.861942 4626 feature_gate.go:330] 
unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.861981 4626 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.861986 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.861991 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.861996 4626 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862003 4626 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862008 4626 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862015 4626 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862020 4626 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862024 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862028 4626 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862032 4626 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862036 4626 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862040 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862044 4626 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862048 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862052 4626 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862056 4626 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862060 4626 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862063 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862067 4626 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862070 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862074 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862078 4626 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862081 4626 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862085 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862089 4626 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862092 4626 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862096 4626 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862099 4626 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862102 4626 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862106 4626 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862109 4626 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862113 4626 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862117 4626 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862124 4626 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862129 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862134 4626 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862140 4626 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862145 4626 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862149 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862155 4626 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862159 4626 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862163 4626 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862167 4626 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862171 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862175 4626 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862182 4626 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862190 4626 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862195 4626 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862199 4626 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862203 4626 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862207 4626 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862211 4626 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862215 4626 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862219 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862222 4626 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862228 4626 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862233 4626 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862237 4626 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862240 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862244 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862251 4626 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862256 4626 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862260 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862263 4626 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862267 4626 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862270 4626 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862273 4626 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862277 4626 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862282 4626 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.862290 4626 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862467 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862473 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862477 4626 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862481 4626 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862484 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862487 4626 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862492 4626 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862511 4626 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862516 4626 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862521 4626 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862525 4626 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862531 4626 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862535 4626 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862539 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862544 4626 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862548 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862552 4626 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862556 4626 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862561 4626 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862567 4626 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862604 4626 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862608 4626 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862612 4626 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862616 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862620 4626 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862623 4626 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862627 4626 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862630 4626 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862633 4626 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862637 4626 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862640 4626 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862644 4626 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862647 4626 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862650 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862654 4626 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862657 4626 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862660 4626 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862666 4626 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862670 4626 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862674 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862692 4626 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862696 4626 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862700 4626 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862704 4626 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862707 4626 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862711 4626 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862715 4626 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862718 4626 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862722 4626 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862726 4626 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862729 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862733 4626 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862736 4626 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862739 4626 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862751 4626 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862754 4626 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862757 4626 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862761 4626 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862764 4626 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862767 4626 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862771 4626 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862774 4626 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862778 4626 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862781 4626 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862784 4626 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862789 4626 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862792 4626 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862796 4626 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862800 4626 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862803 4626 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.862806 4626 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.862811 4626 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.863330 4626 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.866959 4626 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.867052 4626 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.868224 4626 server.go:997] "Starting client certificate rotation"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.868257 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.869158 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-07 06:27:13.099429153 +0000 UTC
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.869265 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.879701 4626 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.881478 4626 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.882358 4626 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.889240 4626 log.go:25] "Validated CRI v1 runtime API"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.906376 4626 log.go:25] "Validated CRI v1 image API"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.907711 4626 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.910846 4626 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-23-06-36-49-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.910881 4626 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}]
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.923556 4626 manager.go:217] Machine: {Timestamp:2026-02-23 06:40:47.922598784 +0000 UTC m=+0.261928070 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2445404 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f91cd4af-3be1-4260-a65d-11f80cafe5a5 BootID:f11baa89-c04c-40b7-af0e-799ac4cacb38 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:3076108 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ee:b0:2d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:ee:b0:2d Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:c5:0b:75 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:40:cf:b6 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:cd:23:3c Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:b2:98:e7 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:9e:a6:a5:c8:9a:7e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2a:bb:49:71:6f:98 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.923781 4626 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.924138 4626 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.924477 4626 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.924720 4626 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.924766 4626 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.925216 4626 topology_manager.go:138] "Creating topology manager with none policy"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.925228 4626 container_manager_linux.go:303] "Creating device plugin manager"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.925544 4626 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.925580 4626 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.926098 4626 state_mem.go:36] "Initialized new in-memory state store"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.926520 4626 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.928644 4626 kubelet.go:418] "Attempting to sync node with API server"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.928669 4626 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.928695 4626 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.928707 4626 kubelet.go:324] "Adding apiserver pod source"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.928721 4626 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.931444 4626 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.931963 4626 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.932052 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused
Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.932083 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused
Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.932131 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.932162 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.932796 4626 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933747 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933772 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933781 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933791 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933806 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933813 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933822 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933835 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933848 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933857 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933884 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.933890 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.934310 4626 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.934781 4626 server.go:1280] "Started kubelet"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.935074 4626 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.935428 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.935628 4626 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.935649 4626 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 23 06:40:47 crc systemd[1]: Started Kubernetes Kubelet.
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.937831 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.937886 4626 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.938301 4626 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.938317 4626 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.938352 4626 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.938454 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.938686 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 15:17:42.030510568 +0000 UTC
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.938799 4626 server.go:460] "Adding debug handlers to kubelet server"
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.938980 4626 factory.go:55] Registering systemd factory
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.939056 4626 factory.go:221] Registration of the systemd container factory successfully
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.939455 4626 factory.go:153] Registering CRI-O factory
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.939548 4626 factory.go:221] Registration of the crio container factory successfully
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.939653 4626 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.939737 4626 factory.go:103] Registering Raw factory
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.939837 4626 manager.go:1196] Started watching for new ooms in manager
Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.940084 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="200ms"
Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.939890 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.58:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1896ccf31bf0503d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,LastTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.939760 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.940417 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.940760 4626 manager.go:319] Starting recovery of all containers Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952773 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952821 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952834 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952845 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952855 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952865 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952875 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952886 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952899 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952936 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952946 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952957 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952967 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952981 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.952993 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953002 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953011 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953022 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953793 4626 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953821 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953836 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953853 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953863 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953877 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953887 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953897 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953907 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953921 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953932 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953950 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953963 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953974 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953985 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" 
seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.953996 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954006 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954015 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954025 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954036 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954045 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: 
I0223 06:40:47.954057 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954070 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954083 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954095 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954107 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954117 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954129 4626 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954139 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954149 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954160 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954172 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954186 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954198 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954210 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954227 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954239 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954257 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954268 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954278 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954287 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954297 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954306 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954317 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954329 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954355 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954366 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954377 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954386 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954396 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954405 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954415 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954424 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954436 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954445 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954455 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954469 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954479 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954491 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954513 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954524 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954533 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954544 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954556 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954567 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954575 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954585 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954595 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954606 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954619 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954631 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954641 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954651 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954660 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954670 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954680 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954692 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954704 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954716 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954727 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954747 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954758 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954770 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954780 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954792 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954805 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954818 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954841 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954852 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954865 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954878 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954893 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954905 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954930 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954940 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954951 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954962 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954972 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954981 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954990 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.954999 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955010 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955020 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955030 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955039 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955049 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955059 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955069 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955078 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955087 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955097 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955108 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955118 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955130 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955140 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955150 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955160 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955171 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955180 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955189 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955198 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955208 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955218 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955229 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955238 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955248 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955259 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955271 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955281 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955292 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955303 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955314 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955326 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955353 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955364 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955373 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955385 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955395 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955407 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955417 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955426 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955445 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955455 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955475 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955485 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955509 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955520 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955531 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955541 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955551 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955562 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955575 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955585 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955595 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955604 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955613 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955622 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955640 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955652 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955661 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955669 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955679 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955688 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955696 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955710 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955722 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955735 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955754 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955764 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955776 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955786 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955796 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955808 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955818 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955831 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955841 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955854 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955869 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955881 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955890 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955901 4626 reconstruct.go:130] "Volume is marked as uncertain and
added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955911 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955922 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955937 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955947 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955956 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955965 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955974 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955983 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.955993 4626 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.956002 4626 reconstruct.go:97] "Volume reconstruction finished" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.956009 4626 reconciler.go:26] "Reconciler: start to sync state" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.967965 4626 manager.go:324] Recovery completed Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.976698 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.978796 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.978902 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.978976 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.978941 4626 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.980553 4626 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.980660 4626 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.980756 4626 kubelet.go:2335] "Starting kubelet main sync loop" Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.980858 4626 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 23 06:40:47 crc kubenswrapper[4626]: W0223 06:40:47.981524 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused Feb 23 06:40:47 crc kubenswrapper[4626]: E0223 06:40:47.981744 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.982002 4626 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.982082 4626 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.982167 4626 state_mem.go:36] "Initialized new in-memory state store" Feb 23 06:40:47 crc 
kubenswrapper[4626]: I0223 06:40:47.987715 4626 policy_none.go:49] "None policy: Start" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.988472 4626 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 23 06:40:47 crc kubenswrapper[4626]: I0223 06:40:47.988522 4626 state_mem.go:35] "Initializing new in-memory state store" Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.039031 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.044526 4626 manager.go:334] "Starting Device Plugin manager" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.044629 4626 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.044652 4626 server.go:79] "Starting device plugin registration server" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.044975 4626 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.044997 4626 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.045144 4626 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.045232 4626 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.045246 4626 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.050962 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.082147 4626 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.082248 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.083278 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.083319 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.083347 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.083570 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.083696 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.083731 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.085411 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.085574 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.085592 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.086931 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.086968 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.086987 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.087440 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.088263 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.088342 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.089306 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.089358 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.089371 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.089759 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.089954 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.090020 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.090892 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.090915 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.091106 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.091131 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.090928 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.091190 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.090969 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.091233 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.091245 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.091411 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: 
I0223 06:40:48.091561 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.091591 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092053 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092077 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092087 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092206 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092222 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092233 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092303 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092330 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092960 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092982 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.092992 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.141342 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="400ms" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.145362 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.146407 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.146437 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.146446 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.146472 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.146882 4626 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.58:6443: connect: connection refused" node="crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158018 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158052 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158078 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158118 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158182 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158218 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158293 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158315 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158344 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158391 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 
06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158427 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158450 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158469 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158488 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.158523 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 
crc kubenswrapper[4626]: I0223 06:40:48.259327 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259372 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259392 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259409 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259424 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259438 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259452 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259456 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259468 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259456 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259520 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259611 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259641 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259676 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259693 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259714 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.259755 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260055 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260086 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260114 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260128 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260144 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 
06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260157 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260164 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260177 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260189 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260193 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260212 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260223 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.260213 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.347696 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.348794 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.348846 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.348856 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.348881 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.349389 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.58:6443: connect: connection refused" 
node="crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.418545 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.429282 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.438046 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-24016fc3a742f79c893f3a301633bd211d71db406a5516b32af3daecd7b14604 WatchSource:0}: Error finding container 24016fc3a742f79c893f3a301633bd211d71db406a5516b32af3daecd7b14604: Status 404 returned error can't find the container with id 24016fc3a742f79c893f3a301633bd211d71db406a5516b32af3daecd7b14604 Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.440978 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.443870 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-bca8f2890eb89ff8c2788371af686a73ad38dc8badb7479ffcb4e207855a048c WatchSource:0}: Error finding container bca8f2890eb89ff8c2788371af686a73ad38dc8badb7479ffcb4e207855a048c: Status 404 returned error can't find the container with id bca8f2890eb89ff8c2788371af686a73ad38dc8badb7479ffcb4e207855a048c Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.448659 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.451492 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7927f4c53137d36b5e3bfdbe7fbf4259789b5a1175c96f4dbc0f7e3632cf1a60 WatchSource:0}: Error finding container 7927f4c53137d36b5e3bfdbe7fbf4259789b5a1175c96f4dbc0f7e3632cf1a60: Status 404 returned error can't find the container with id 7927f4c53137d36b5e3bfdbe7fbf4259789b5a1175c96f4dbc0f7e3632cf1a60 Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.453805 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.460268 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4215da0c0216514ef8a01540f5e5161a90738d52684a92af9b320dfb828f5ed3 WatchSource:0}: Error finding container 4215da0c0216514ef8a01540f5e5161a90738d52684a92af9b320dfb828f5ed3: Status 404 returned error can't find the container with id 4215da0c0216514ef8a01540f5e5161a90738d52684a92af9b320dfb828f5ed3 Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.466375 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-400eaf3fe47566607a638aee59836871baa23db454dc56ac55f14fa3b10234ee WatchSource:0}: Error finding container 400eaf3fe47566607a638aee59836871baa23db454dc56ac55f14fa3b10234ee: Status 404 returned error can't find the container with id 400eaf3fe47566607a638aee59836871baa23db454dc56ac55f14fa3b10234ee Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.487825 4626 event.go:368] "Unable to write 
event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.58:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1896ccf31bf0503d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,LastTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.542268 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="800ms" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.750221 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.751077 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.751107 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.751115 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.751136 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.751683 4626 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.58:6443: connect: connection refused" node="crc" Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.779518 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.779587 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.786537 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused Feb 23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.786598 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:40:48 crc kubenswrapper[4626]: W0223 06:40:48.912648 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused Feb 
23 06:40:48 crc kubenswrapper[4626]: E0223 06:40:48.912934 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.936970 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.939184 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 19:38:13.018984827 +0000 UTC Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.985875 4626 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6" exitCode=0 Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.985949 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.986031 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"24016fc3a742f79c893f3a301633bd211d71db406a5516b32af3daecd7b14604"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.986103 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.987167 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.987193 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.987201 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.989623 4626 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098" exitCode=0 Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.989666 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.989683 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"400eaf3fe47566607a638aee59836871baa23db454dc56ac55f14fa3b10234ee"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.989749 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.991507 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.991532 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: 
I0223 06:40:48.991540 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.992279 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b79c01946e9f21e84f63573aec4dfe15408ce260d0c009b72abed6fa20c9335"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.992328 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4215da0c0216514ef8a01540f5e5161a90738d52684a92af9b320dfb828f5ed3"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.993559 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145" exitCode=0 Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.993610 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.993627 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7927f4c53137d36b5e3bfdbe7fbf4259789b5a1175c96f4dbc0f7e3632cf1a60"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.993686 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.994417 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.994447 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.994456 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995165 4626 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4" exitCode=0 Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995189 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995208 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bca8f2890eb89ff8c2788371af686a73ad38dc8badb7479ffcb4e207855a048c"} Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995273 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995622 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995929 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995949 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.995958 4626 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.996191 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.996221 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:48 crc kubenswrapper[4626]: I0223 06:40:48.996232 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:49 crc kubenswrapper[4626]: E0223 06:40:49.342848 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="1.6s" Feb 23 06:40:49 crc kubenswrapper[4626]: W0223 06:40:49.491574 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.58:6443: connect: connection refused Feb 23 06:40:49 crc kubenswrapper[4626]: E0223 06:40:49.491656 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.58:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:40:49 crc kubenswrapper[4626]: I0223 06:40:49.552260 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:49 crc kubenswrapper[4626]: I0223 06:40:49.553352 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:49 crc 
kubenswrapper[4626]: I0223 06:40:49.553394 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:49 crc kubenswrapper[4626]: I0223 06:40:49.553404 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:49 crc kubenswrapper[4626]: I0223 06:40:49.553429 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:40:49 crc kubenswrapper[4626]: E0223 06:40:49.553906 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.58:6443: connect: connection refused" node="crc" Feb 23 06:40:49 crc kubenswrapper[4626]: I0223 06:40:49.939407 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:57:51.352433085 +0000 UTC Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.000062 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"25f967d40becf0537406ba63da0c005ab507e8b8bac92379acd1fd456dc18eec"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.000111 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.000115 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a2948892f6e9d19854d41fbe361ed6fa6b597f31a5a25a076d59e7378451a97c"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.000232 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"236cad9b788f25859f86305a13d0a5c6a95cbb22da699039d92ddd53e3af3624"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.000854 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.000891 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.000904 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.002859 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"707c91827484728db1c6701877c4f25cc64278ca9305e843fe3f6fc91b495e66"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.002886 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.002900 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.002910 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.002919 4626 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.003031 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.003789 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.003821 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.003830 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.004180 4626 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751" exitCode=0 Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.004234 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.004348 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.005196 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.005228 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:50 
crc kubenswrapper[4626]: I0223 06:40:50.005239 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.005981 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.006135 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.007023 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.007110 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.007165 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.008387 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.008417 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.008437 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab"} Feb 23 06:40:50 
crc kubenswrapper[4626]: I0223 06:40:50.008449 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9"} Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.008533 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.009043 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.009070 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.009079 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:50 crc kubenswrapper[4626]: I0223 06:40:50.939861 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 01:17:17.399917183 +0000 UTC Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.012558 4626 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9" exitCode=0 Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.012630 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9"} Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.012710 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 
06:40:51.012785 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.013691 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.013730 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.013750 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.013761 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.013733 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.013837 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.154841 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.155649 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.155690 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.155701 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.155725 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:40:51 crc 
kubenswrapper[4626]: I0223 06:40:51.472378 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.472932 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.472977 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.473885 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.473914 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.473924 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.492849 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:51 crc kubenswrapper[4626]: I0223 06:40:51.940920 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 02:02:32.630155456 +0000 UTC Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.019670 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286"} Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.019715 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b"} Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.019737 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38"} Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.019748 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de"} Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.019759 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e"} Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.019829 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.019902 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.020869 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.020909 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.020922 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.021028 4626 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.021059 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.021073 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:52 crc kubenswrapper[4626]: I0223 06:40:52.941859 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 09:56:43.485542651 +0000 UTC Feb 23 06:40:53 crc kubenswrapper[4626]: I0223 06:40:53.942486 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 17:36:52.221272858 +0000 UTC Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.321862 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.322074 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.322866 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.322967 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.323034 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.521780 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.522062 4626 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.523261 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.523350 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.523412 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:54 crc kubenswrapper[4626]: I0223 06:40:54.942786 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 01:30:46.961559115 +0000 UTC Feb 23 06:40:55 crc kubenswrapper[4626]: I0223 06:40:55.116095 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:55 crc kubenswrapper[4626]: I0223 06:40:55.116236 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:55 crc kubenswrapper[4626]: I0223 06:40:55.117156 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:55 crc kubenswrapper[4626]: I0223 06:40:55.117187 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:55 crc kubenswrapper[4626]: I0223 06:40:55.117196 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:55 crc kubenswrapper[4626]: I0223 06:40:55.943091 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 01:12:13.033301051 +0000 UTC Feb 23 06:40:56 
crc kubenswrapper[4626]: I0223 06:40:56.822360 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 23 06:40:56 crc kubenswrapper[4626]: I0223 06:40:56.822641 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:56 crc kubenswrapper[4626]: I0223 06:40:56.824142 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:56 crc kubenswrapper[4626]: I0223 06:40:56.824197 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:56 crc kubenswrapper[4626]: I0223 06:40:56.824209 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:56 crc kubenswrapper[4626]: I0223 06:40:56.943821 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 14:58:07.898699294 +0000 UTC Feb 23 06:40:57 crc kubenswrapper[4626]: I0223 06:40:57.393044 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:57 crc kubenswrapper[4626]: I0223 06:40:57.393214 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:57 crc kubenswrapper[4626]: I0223 06:40:57.394180 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:57 crc kubenswrapper[4626]: I0223 06:40:57.394246 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:57 crc kubenswrapper[4626]: I0223 06:40:57.394263 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:57 crc 
kubenswrapper[4626]: I0223 06:40:57.804132 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:57 crc kubenswrapper[4626]: I0223 06:40:57.944244 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 05:28:50.885940157 +0000 UTC Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.032153 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.033003 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.033046 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.033055 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:58 crc kubenswrapper[4626]: E0223 06:40:58.051112 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.089908 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.090039 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.090871 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.090906 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.090916 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:58 crc kubenswrapper[4626]: I0223 06:40:58.944602 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 13:25:17.618559344 +0000 UTC Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.362308 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.362516 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.363446 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.363473 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.363483 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.369732 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.938064 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.945187 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 
UTC, rotation deadline is 2025-11-29 09:57:07.939211901 +0000 UTC Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.983642 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" interval="3.2s" Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.984690 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896ccf31bf0503d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,LastTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.987102 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.987460 4626 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup 
probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 23 06:40:59 crc kubenswrapper[4626]: I0223 06:40:59.987535 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 23 06:40:59 crc kubenswrapper[4626]: W0223 06:40:59.993154 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.993222 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:40:59 crc kubenswrapper[4626]: W0223 06:40:59.994649 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.994723 4626 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.995954 4626 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:40:59 crc kubenswrapper[4626]: W0223 06:40:59.996242 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.996282 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:40:59 crc kubenswrapper[4626]: W0223 06:40:59.999515 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z Feb 23 06:40:59 crc kubenswrapper[4626]: E0223 06:40:59.999578 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:40:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.003170 4626 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.003208 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.035523 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.036338 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.036369 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.036382 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.040316 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.210680 4626 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34908->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.210730 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34908->192.168.126.11:17697: read: connection reset by peer" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.393383 4626 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.393446 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.938847 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:00Z is after 2026-02-23T05:33:13Z Feb 23 06:41:00 crc kubenswrapper[4626]: I0223 06:41:00.945931 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 23:04:23.882055442 +0000 UTC Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.039577 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.044531 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="707c91827484728db1c6701877c4f25cc64278ca9305e843fe3f6fc91b495e66" exitCode=255 Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.044692 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.044921 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"707c91827484728db1c6701877c4f25cc64278ca9305e843fe3f6fc91b495e66"} Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.045152 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.045405 4626 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.045439 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.045451 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.046095 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.046127 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.046138 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.046590 4626 scope.go:117] "RemoveContainer" containerID="707c91827484728db1c6701877c4f25cc64278ca9305e843fe3f6fc91b495e66" Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.938355 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:01Z is after 2026-02-23T05:33:13Z Feb 23 06:41:01 crc kubenswrapper[4626]: I0223 06:41:01.946848 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 18:48:08.531648362 +0000 UTC Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.049162 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.049826 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.051840 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e" exitCode=255 Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.051894 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e"} Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.051975 4626 scope.go:117] "RemoveContainer" containerID="707c91827484728db1c6701877c4f25cc64278ca9305e843fe3f6fc91b495e66" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.052174 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.053004 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.053033 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.053045 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.053543 4626 scope.go:117] "RemoveContainer" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e" Feb 23 06:41:02 
crc kubenswrapper[4626]: E0223 06:41:02.053729 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.938737 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:02Z is after 2026-02-23T05:33:13Z Feb 23 06:41:02 crc kubenswrapper[4626]: I0223 06:41:02.947848 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 06:10:06.564773455 +0000 UTC Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.055123 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 23 06:41:03 crc kubenswrapper[4626]: E0223 06:41:03.186122 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:03Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.187283 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.188197 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.188331 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.188426 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.188531 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:41:03 crc kubenswrapper[4626]: E0223 06:41:03.191026 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:03Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.937982 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:03Z is after 2026-02-23T05:33:13Z Feb 23 06:41:03 crc kubenswrapper[4626]: I0223 06:41:03.948234 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 04:06:00.362335253 +0000 UTC Feb 23 06:41:03 crc kubenswrapper[4626]: W0223 06:41:03.991646 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:03Z is after 
2026-02-23T05:33:13Z Feb 23 06:41:03 crc kubenswrapper[4626]: E0223 06:41:03.991756 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.324845 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:41:04 crc kubenswrapper[4626]: E0223 06:41:04.327741 4626 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.525822 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.525958 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.526981 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.527016 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.527025 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.527471 4626 scope.go:117] "RemoveContainer" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e" Feb 23 06:41:04 crc kubenswrapper[4626]: E0223 06:41:04.527659 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.529183 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.938286 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:04Z is after 2026-02-23T05:33:13Z Feb 23 06:41:04 crc kubenswrapper[4626]: I0223 06:41:04.948999 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 10:53:58.130740624 +0000 UTC Feb 23 06:41:05 crc kubenswrapper[4626]: I0223 06:41:05.063677 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:05 crc kubenswrapper[4626]: I0223 06:41:05.064342 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:05 crc kubenswrapper[4626]: I0223 06:41:05.064388 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:05 crc kubenswrapper[4626]: I0223 06:41:05.064397 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:05 crc kubenswrapper[4626]: I0223 06:41:05.064842 4626 scope.go:117] "RemoveContainer" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e" Feb 23 06:41:05 crc kubenswrapper[4626]: E0223 06:41:05.064995 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:41:05 crc kubenswrapper[4626]: W0223 06:41:05.277271 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:05Z is after 2026-02-23T05:33:13Z Feb 23 06:41:05 crc kubenswrapper[4626]: E0223 06:41:05.277359 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:05 crc kubenswrapper[4626]: I0223 06:41:05.938465 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:05Z is after 2026-02-23T05:33:13Z Feb 23 06:41:05 crc kubenswrapper[4626]: I0223 06:41:05.949126 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 07:08:53.367106656 +0000 UTC Feb 23 06:41:06 crc kubenswrapper[4626]: W0223 06:41:06.283904 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:06Z is after 2026-02-23T05:33:13Z Feb 23 06:41:06 crc kubenswrapper[4626]: E0223 06:41:06.284005 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:06 crc kubenswrapper[4626]: W0223 06:41:06.300908 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:06Z is after 2026-02-23T05:33:13Z Feb 23 06:41:06 crc kubenswrapper[4626]: E0223 06:41:06.301017 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.845283 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.845468 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.846415 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.846463 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.846475 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.854214 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.938640 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:06Z is after 2026-02-23T05:33:13Z Feb 23 06:41:06 crc kubenswrapper[4626]: I0223 06:41:06.949738 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 17:06:14.432121209 +0000 UTC 
Feb 23 06:41:07 crc kubenswrapper[4626]: I0223 06:41:07.067336 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:07 crc kubenswrapper[4626]: I0223 06:41:07.068036 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:07 crc kubenswrapper[4626]: I0223 06:41:07.068066 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:07 crc kubenswrapper[4626]: I0223 06:41:07.068079 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:07 crc kubenswrapper[4626]: I0223 06:41:07.938305 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:07Z is after 2026-02-23T05:33:13Z Feb 23 06:41:07 crc kubenswrapper[4626]: I0223 06:41:07.950075 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:35:40.084094991 +0000 UTC Feb 23 06:41:08 crc kubenswrapper[4626]: E0223 06:41:08.051261 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:41:08 crc kubenswrapper[4626]: I0223 06:41:08.938172 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:08Z is after 2026-02-23T05:33:13Z Feb 23 06:41:08 crc kubenswrapper[4626]: I0223 06:41:08.950526 4626 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 05:14:47.415878545 +0000 UTC Feb 23 06:41:09 crc kubenswrapper[4626]: E0223 06:41:09.589324 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:09Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:41:09 crc kubenswrapper[4626]: I0223 06:41:09.591455 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:09 crc kubenswrapper[4626]: I0223 06:41:09.592638 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:09 crc kubenswrapper[4626]: I0223 06:41:09.592674 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:09 crc kubenswrapper[4626]: I0223 06:41:09.592685 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:09 crc kubenswrapper[4626]: I0223 06:41:09.592712 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:41:09 crc kubenswrapper[4626]: E0223 06:41:09.594700 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:09Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:41:09 crc kubenswrapper[4626]: I0223 06:41:09.938534 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:09Z is after 2026-02-23T05:33:13Z Feb 23 06:41:09 crc kubenswrapper[4626]: I0223 06:41:09.950975 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 13:35:54.981261503 +0000 UTC Feb 23 06:41:09 crc kubenswrapper[4626]: E0223 06:41:09.987736 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:09Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896ccf31bf0503d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,LastTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.144894 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.145182 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.146073 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:10 crc 
kubenswrapper[4626]: I0223 06:41:10.146186 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.146250 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.146762 4626 scope.go:117] "RemoveContainer" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e" Feb 23 06:41:10 crc kubenswrapper[4626]: E0223 06:41:10.147004 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.394293 4626 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.394349 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.938805 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:10Z is after 2026-02-23T05:33:13Z Feb 23 06:41:10 crc kubenswrapper[4626]: I0223 06:41:10.951209 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 11:05:33.370557344 +0000 UTC Feb 23 06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.493107 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.493369 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.494492 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.494536 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.494557 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.495036 4626 scope.go:117] "RemoveContainer" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e" Feb 23 06:41:11 crc kubenswrapper[4626]: E0223 06:41:11.495220 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 
06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.938116 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:11Z is after 2026-02-23T05:33:13Z Feb 23 06:41:11 crc kubenswrapper[4626]: I0223 06:41:11.951971 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 04:17:01.220281451 +0000 UTC Feb 23 06:41:12 crc kubenswrapper[4626]: I0223 06:41:12.937754 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:12Z is after 2026-02-23T05:33:13Z Feb 23 06:41:12 crc kubenswrapper[4626]: I0223 06:41:12.952050 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 21:54:55.904117729 +0000 UTC Feb 23 06:41:12 crc kubenswrapper[4626]: I0223 06:41:12.989472 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:41:12 crc kubenswrapper[4626]: E0223 06:41:12.992118 4626 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 
06:41:13 crc kubenswrapper[4626]: I0223 06:41:13.938352 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:13Z is after 2026-02-23T05:33:13Z Feb 23 06:41:13 crc kubenswrapper[4626]: I0223 06:41:13.952742 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:04:10.629817406 +0000 UTC Feb 23 06:41:14 crc kubenswrapper[4626]: W0223 06:41:14.740182 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:14Z is after 2026-02-23T05:33:13Z Feb 23 06:41:14 crc kubenswrapper[4626]: E0223 06:41:14.740289 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:14 crc kubenswrapper[4626]: I0223 06:41:14.938059 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:14Z is after 2026-02-23T05:33:13Z Feb 23 06:41:14 crc kubenswrapper[4626]: I0223 
06:41:14.953597 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 12:23:45.707437363 +0000 UTC Feb 23 06:41:15 crc kubenswrapper[4626]: W0223 06:41:15.779183 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:15Z is after 2026-02-23T05:33:13Z Feb 23 06:41:15 crc kubenswrapper[4626]: E0223 06:41:15.779275 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:15 crc kubenswrapper[4626]: W0223 06:41:15.847011 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:15Z is after 2026-02-23T05:33:13Z Feb 23 06:41:15 crc kubenswrapper[4626]: E0223 06:41:15.847091 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:15Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Feb 23 06:41:15 crc kubenswrapper[4626]: I0223 06:41:15.938705 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:15Z is after 2026-02-23T05:33:13Z Feb 23 06:41:15 crc kubenswrapper[4626]: I0223 06:41:15.953990 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 22:01:12.21464191 +0000 UTC Feb 23 06:41:16 crc kubenswrapper[4626]: E0223 06:41:16.591833 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:16Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 23 06:41:16 crc kubenswrapper[4626]: I0223 06:41:16.595020 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:16 crc kubenswrapper[4626]: I0223 06:41:16.596561 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:16 crc kubenswrapper[4626]: I0223 06:41:16.596605 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:16 crc kubenswrapper[4626]: I0223 06:41:16.596617 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:16 crc kubenswrapper[4626]: I0223 06:41:16.596652 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:41:16 crc kubenswrapper[4626]: E0223 06:41:16.598843 4626 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:16Z is after 2026-02-23T05:33:13Z" node="crc" Feb 23 06:41:16 crc kubenswrapper[4626]: I0223 06:41:16.937764 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:16Z is after 2026-02-23T05:33:13Z Feb 23 06:41:16 crc kubenswrapper[4626]: I0223 06:41:16.955223 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 05:04:36.806799695 +0000 UTC Feb 23 06:41:17 crc kubenswrapper[4626]: W0223 06:41:17.584073 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:17Z is after 2026-02-23T05:33:13Z Feb 23 06:41:17 crc kubenswrapper[4626]: E0223 06:41:17.584194 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:17 crc kubenswrapper[4626]: I0223 06:41:17.938440 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:17Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:17 crc kubenswrapper[4626]: I0223 06:41:17.955864 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:37:03.207904196 +0000 UTC
Feb 23 06:41:18 crc kubenswrapper[4626]: E0223 06:41:18.051578 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 23 06:41:18 crc kubenswrapper[4626]: I0223 06:41:18.938604 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:18Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:18 crc kubenswrapper[4626]: I0223 06:41:18.956117 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 18:16:03.941551894 +0000 UTC
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.798143 4626 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55930->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.798224 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55930->192.168.126.11:10357: read: connection reset by peer"
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.798307 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.798551 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.800078 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.800131 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.800145 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.800741 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"236cad9b788f25859f86305a13d0a5c6a95cbb22da699039d92ddd53e3af3624"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.800932 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://236cad9b788f25859f86305a13d0a5c6a95cbb22da699039d92ddd53e3af3624" gracePeriod=30
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.938468 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:19Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:19 crc kubenswrapper[4626]: I0223 06:41:19.957181 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 20:58:08.558693803 +0000 UTC
Feb 23 06:41:19 crc kubenswrapper[4626]: E0223 06:41:19.991552 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:19Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896ccf31bf0503d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,LastTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.099749 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.100416 4626 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="236cad9b788f25859f86305a13d0a5c6a95cbb22da699039d92ddd53e3af3624" exitCode=255
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.100471 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"236cad9b788f25859f86305a13d0a5c6a95cbb22da699039d92ddd53e3af3624"}
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.100561 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e"}
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.100687 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.101428 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.101470 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.101484 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.938135 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:20Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:20 crc kubenswrapper[4626]: I0223 06:41:20.957751 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 05:15:43.499983501 +0000 UTC
Feb 23 06:41:21 crc kubenswrapper[4626]: I0223 06:41:21.938756 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:21Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:21 crc kubenswrapper[4626]: I0223 06:41:21.958151 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:46:28.243510744 +0000 UTC
Feb 23 06:41:22 crc kubenswrapper[4626]: I0223 06:41:22.937951 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:22Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:22 crc kubenswrapper[4626]: I0223 06:41:22.958433 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 01:30:06.435416298 +0000 UTC
Feb 23 06:41:23 crc kubenswrapper[4626]: E0223 06:41:23.594956 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:23Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.599157 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.600890 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.600944 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.600957 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.600992 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:41:23 crc kubenswrapper[4626]: E0223 06:41:23.603198 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:23Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.938766 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:23Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.958829 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 15:47:00.524381864 +0000 UTC
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.981192 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.982313 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.982515 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.982602 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:23 crc kubenswrapper[4626]: I0223 06:41:23.983212 4626 scope.go:117] "RemoveContainer" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e"
Feb 23 06:41:24 crc kubenswrapper[4626]: I0223 06:41:24.111277 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 23 06:41:24 crc kubenswrapper[4626]: I0223 06:41:24.937929 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:24Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:24 crc kubenswrapper[4626]: I0223 06:41:24.959483 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 17:31:41.623175477 +0000 UTC
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.116197 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.116349 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.117257 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.117303 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.117313 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.118546 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.119408 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.121296 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7" exitCode=255
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.121335 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7"}
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.121383 4626 scope.go:117] "RemoveContainer" containerID="85ede3ed51b526e59977117c44cc014928a5cd7749846d5fe66dc597228c9f3e"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.121582 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.125864 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.125897 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.125915 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.127827 4626 scope.go:117] "RemoveContainer" containerID="54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7"
Feb 23 06:41:25 crc kubenswrapper[4626]: E0223 06:41:25.128074 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.938639 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:25Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:25 crc kubenswrapper[4626]: I0223 06:41:25.960107 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 13:18:41.042575322 +0000 UTC
Feb 23 06:41:26 crc kubenswrapper[4626]: I0223 06:41:26.125828 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 23 06:41:26 crc kubenswrapper[4626]: I0223 06:41:26.938353 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:26Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:26 crc kubenswrapper[4626]: I0223 06:41:26.960902 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 00:15:19.197057743 +0000 UTC
Feb 23 06:41:27 crc kubenswrapper[4626]: I0223 06:41:27.393830 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:41:27 crc kubenswrapper[4626]: I0223 06:41:27.394033 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:27 crc kubenswrapper[4626]: I0223 06:41:27.395053 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:27 crc kubenswrapper[4626]: I0223 06:41:27.395163 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:27 crc kubenswrapper[4626]: I0223 06:41:27.395232 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:27 crc kubenswrapper[4626]: I0223 06:41:27.938685 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:27Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:27 crc kubenswrapper[4626]: I0223 06:41:27.961048 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 23:42:34.016994125 +0000 UTC
Feb 23 06:41:28 crc kubenswrapper[4626]: E0223 06:41:28.051790 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 23 06:41:28 crc kubenswrapper[4626]: I0223 06:41:28.938363 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:28Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:28 crc kubenswrapper[4626]: I0223 06:41:28.961971 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 15:01:38.964251831 +0000 UTC
Feb 23 06:41:29 crc kubenswrapper[4626]: I0223 06:41:29.070946 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 23 06:41:29 crc kubenswrapper[4626]: E0223 06:41:29.073834 4626 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:41:29 crc kubenswrapper[4626]: E0223 06:41:29.075032 4626 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Feb 23 06:41:29 crc kubenswrapper[4626]: I0223 06:41:29.938951 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:29Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:29 crc kubenswrapper[4626]: I0223 06:41:29.962363 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 04:04:35.089924935 +0000 UTC
Feb 23 06:41:29 crc kubenswrapper[4626]: E0223 06:41:29.994055 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896ccf31bf0503d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,LastTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.145276 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.145664 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.146946 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.147065 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.147143 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.147765 4626 scope.go:117] "RemoveContainer" containerID="54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7"
Feb 23 06:41:30 crc kubenswrapper[4626]: E0223 06:41:30.148024 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.394569 4626 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.394626 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 23 06:41:30 crc kubenswrapper[4626]: E0223 06:41:30.598406 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:30Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.603445 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.604751 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.604855 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.604926 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.605008 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:41:30 crc kubenswrapper[4626]: E0223 06:41:30.607091 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:30Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.938127 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:30Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:30 crc kubenswrapper[4626]: I0223 06:41:30.962593 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:21:05.126239696 +0000 UTC
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.493734 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.493897 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.495024 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.495073 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.495085 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.495647 4626 scope.go:117] "RemoveContainer" containerID="54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7"
Feb 23 06:41:31 crc kubenswrapper[4626]: E0223 06:41:31.495845 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.938136 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:31Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:31 crc kubenswrapper[4626]: I0223 06:41:31.963570 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 03:17:14.751355696 +0000 UTC
Feb 23 06:41:31 crc kubenswrapper[4626]: W0223 06:41:31.991577 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:31Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:31 crc kubenswrapper[4626]: E0223 06:41:31.991654 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:41:32 crc kubenswrapper[4626]: I0223 06:41:32.938621 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:32Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:32 crc kubenswrapper[4626]: I0223 06:41:32.964306 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 15:21:29.348733647 +0000 UTC
Feb 23 06:41:33 crc kubenswrapper[4626]: I0223 06:41:33.938590 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:33Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:33 crc kubenswrapper[4626]: I0223 06:41:33.965150 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:04:37.260172182 +0000 UTC
Feb 23 06:41:34 crc kubenswrapper[4626]: I0223 06:41:34.938949 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:34Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:34 crc kubenswrapper[4626]: I0223 06:41:34.966036 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 01:17:02.127034274 +0000 UTC
Feb 23 06:41:35 crc kubenswrapper[4626]: I0223 06:41:35.938533 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:35Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:35 crc kubenswrapper[4626]: I0223 06:41:35.966250 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 15:09:50.671545258 +0000 UTC
Feb 23 06:41:36 crc kubenswrapper[4626]: I0223 06:41:36.938806 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:36Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:36 crc kubenswrapper[4626]: I0223 06:41:36.967355 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 04:03:08.39395494 +0000 UTC
Feb 23 06:41:37 crc kubenswrapper[4626]: W0223 06:41:37.045193 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:37Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:37 crc kubenswrapper[4626]: E0223 06:41:37.045272 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:41:37 crc kubenswrapper[4626]: W0223 06:41:37.443888 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:37Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:37 crc kubenswrapper[4626]: E0223 06:41:37.443950 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:41:37 crc kubenswrapper[4626]: E0223 06:41:37.601333 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:37Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 23 06:41:37 crc kubenswrapper[4626]: I0223 06:41:37.607422 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:37 crc kubenswrapper[4626]: I0223 06:41:37.608674 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:37 crc kubenswrapper[4626]: I0223 06:41:37.608736 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:37 crc kubenswrapper[4626]: I0223 06:41:37.608750 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:37 crc kubenswrapper[4626]: I0223 06:41:37.608783 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:41:37 crc kubenswrapper[4626]: E0223 06:41:37.610822 4626 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:37Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 23 06:41:37 crc kubenswrapper[4626]: I0223 06:41:37.938619 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:37Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:37 crc kubenswrapper[4626]: I0223 06:41:37.968113 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 05:29:30.52564073 +0000 UTC
Feb 23 06:41:38 crc kubenswrapper[4626]: E0223 06:41:38.052695 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 23 06:41:38 crc kubenswrapper[4626]: I0223 06:41:38.094397 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:41:38 crc kubenswrapper[4626]: I0223 06:41:38.094786 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:41:38 crc kubenswrapper[4626]: I0223 06:41:38.095921 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:41:38 crc kubenswrapper[4626]: I0223 06:41:38.095972 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:41:38 crc kubenswrapper[4626]: I0223 06:41:38.095986 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:41:38 crc kubenswrapper[4626]: I0223 06:41:38.937861 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:38Z is after 2026-02-23T05:33:13Z
Feb 23 06:41:38 crc kubenswrapper[4626]: I0223 06:41:38.968351 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 14:49:19.796994189
+0000 UTC Feb 23 06:41:39 crc kubenswrapper[4626]: W0223 06:41:39.100789 4626 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:39Z is after 2026-02-23T05:33:13Z Feb 23 06:41:39 crc kubenswrapper[4626]: E0223 06:41:39.100866 4626 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 23 06:41:39 crc kubenswrapper[4626]: I0223 06:41:39.938428 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:39Z is after 2026-02-23T05:33:13Z Feb 23 06:41:39 crc kubenswrapper[4626]: I0223 06:41:39.968473 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:32:36.774159746 +0000 UTC Feb 23 06:41:39 crc kubenswrapper[4626]: E0223 06:41:39.997406 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896ccf31bf0503d default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,LastTimestamp:2026-02-23 06:40:47.934722109 +0000 UTC m=+0.274051375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:41:40 crc kubenswrapper[4626]: I0223 06:41:40.394302 4626 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:41:40 crc kubenswrapper[4626]: I0223 06:41:40.394543 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:41:40 crc kubenswrapper[4626]: I0223 06:41:40.938883 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:40Z is after 2026-02-23T05:33:13Z Feb 23 06:41:40 crc kubenswrapper[4626]: I0223 06:41:40.969219 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 
23:13:44.773166316 +0000 UTC Feb 23 06:41:41 crc kubenswrapper[4626]: I0223 06:41:41.938174 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:41Z is after 2026-02-23T05:33:13Z Feb 23 06:41:41 crc kubenswrapper[4626]: I0223 06:41:41.969681 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 05:42:58.811052075 +0000 UTC Feb 23 06:41:41 crc kubenswrapper[4626]: I0223 06:41:41.981299 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:41 crc kubenswrapper[4626]: I0223 06:41:41.982524 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:41 crc kubenswrapper[4626]: I0223 06:41:41.982571 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:41 crc kubenswrapper[4626]: I0223 06:41:41.982582 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:41 crc kubenswrapper[4626]: I0223 06:41:41.983151 4626 scope.go:117] "RemoveContainer" containerID="54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7" Feb 23 06:41:41 crc kubenswrapper[4626]: E0223 06:41:41.983369 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
Feb 23 06:41:42 crc kubenswrapper[4626]: I0223 06:41:42.939216 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:42Z is after 2026-02-23T05:33:13Z Feb 23 06:41:42 crc kubenswrapper[4626]: I0223 06:41:42.970738 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 17:34:12.481014521 +0000 UTC Feb 23 06:41:43 crc kubenswrapper[4626]: I0223 06:41:43.938728 4626 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:41:43Z is after 2026-02-23T05:33:13Z Feb 23 06:41:43 crc kubenswrapper[4626]: I0223 06:41:43.972231 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:21:41.207974204 +0000 UTC Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.192039 4626 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.610893 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.612122 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.612179 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.612192 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.612377 4626 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.619125 4626 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.619212 4626 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.619236 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.621461 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.621514 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.621526 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.621547 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.621560 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:44Z","lastTransitionTime":"2026-02-23T06:41:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.629728 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d
-11f80cafe5a5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.634581 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.634614 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.634626 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.634647 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.634658 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:44Z","lastTransitionTime":"2026-02-23T06:41:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.641622 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d
-11f80cafe5a5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.646125 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.646152 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.646162 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.646175 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.646185 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:44Z","lastTransitionTime":"2026-02-23T06:41:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.652200 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d
-11f80cafe5a5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.656751 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.656783 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.656793 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.656828 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.656838 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:44Z","lastTransitionTime":"2026-02-23T06:41:44Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.662805 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:44Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d
-11f80cafe5a5\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.662908 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.662936 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.763222 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.863727 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:44 crc kubenswrapper[4626]: E0223 06:41:44.964565 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:44 crc kubenswrapper[4626]: I0223 06:41:44.972728 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 03:01:03.625142143 +0000 UTC Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.064674 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.165700 4626 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.266671 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.367185 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.467666 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.568436 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.669051 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.769735 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.870236 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: E0223 06:41:45.970627 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:45 crc kubenswrapper[4626]: I0223 06:41:45.973838 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:43:26.332613398 +0000 UTC Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.070921 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.172014 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.273122 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.373904 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.474612 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.575377 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.676313 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.777144 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.877988 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:46 crc kubenswrapper[4626]: I0223 06:41:46.974650 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 18:14:19.31078238 +0000 UTC Feb 23 06:41:46 crc kubenswrapper[4626]: E0223 06:41:46.978906 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.079027 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.180176 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 
06:41:47.280916 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.381964 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.482792 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.583773 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.684696 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.785408 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.886298 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:47 crc kubenswrapper[4626]: I0223 06:41:47.975743 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 22:11:10.319599923 +0000 UTC Feb 23 06:41:47 crc kubenswrapper[4626]: E0223 06:41:47.986576 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.053107 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.087222 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.187834 4626 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.290884 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.391753 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.492668 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.593536 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.694244 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.794760 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.895705 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:48 crc kubenswrapper[4626]: I0223 06:41:48.976123 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 06:42:01.748834891 +0000 UTC Feb 23 06:41:48 crc kubenswrapper[4626]: E0223 06:41:48.996481 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.097650 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.198487 4626 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.299567 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.400123 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.501163 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.602203 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.702947 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.803629 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: E0223 06:41:49.903972 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:49 crc kubenswrapper[4626]: I0223 06:41:49.976386 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 05:51:36.889899277 +0000 UTC Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.004147 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.104820 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.192671 4626 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:34398->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.192725 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:34398->192.168.126.11:10357: read: connection reset by peer" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.192770 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.192900 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.194199 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.194217 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.194244 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.194826 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 23 
06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.195011 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e" gracePeriod=30 Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.205275 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.305583 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.406309 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.506955 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.607190 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.707790 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.808323 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: E0223 06:41:50.909216 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:50 crc kubenswrapper[4626]: I0223 06:41:50.976975 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 09:19:57.388831731 +0000 UTC Feb 23 
06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.010037 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.110840 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.193232 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.194144 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.194514 4626 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e" exitCode=255 Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.194572 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e"} Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.194604 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eb3b6468591bc11855d004b0b5ae0f9292185085b3f843f96cbada8f30137baf"} Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.194624 4626 scope.go:117] "RemoveContainer" containerID="236cad9b788f25859f86305a13d0a5c6a95cbb22da699039d92ddd53e3af3624" Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 
06:41:51.194864 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.197361 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.197384 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.197394 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.211280 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.312116 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.413165 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.513447 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.614185 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.714817 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.815379 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:51 crc kubenswrapper[4626]: E0223 06:41:51.916318 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 
23 06:41:51 crc kubenswrapper[4626]: I0223 06:41:51.977314 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 12:15:23.118440821 +0000 UTC Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.016606 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.117063 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: I0223 06:41:52.199447 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.218663 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.319575 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.420428 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.521147 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.621662 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.722172 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.822696 4626 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: E0223 06:41:52.923646 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:52 crc kubenswrapper[4626]: I0223 06:41:52.977761 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 18:41:45.781191901 +0000 UTC Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.024070 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.124287 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.224372 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.324841 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.425941 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.527715 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.628232 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.728774 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: E0223 06:41:53.829292 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc 
kubenswrapper[4626]: E0223 06:41:53.930135 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:53 crc kubenswrapper[4626]: I0223 06:41:53.977878 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 02:04:22.111132562 +0000 UTC Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.031197 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.131840 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.232694 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.333566 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.434533 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.535158 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.635625 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.667992 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.677228 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 
06:41:54.677363 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.677458 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.677575 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.677671 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:54Z","lastTransitionTime":"2026-02-23T06:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.691371 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.701592 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.701640 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.701651 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.701672 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.701684 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:54Z","lastTransitionTime":"2026-02-23T06:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.712494 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.717730 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.717763 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.717774 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.717788 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.717798 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:54Z","lastTransitionTime":"2026-02-23T06:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.726479 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.731542 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.731687 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.731756 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.731827 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.731897 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:41:54Z","lastTransitionTime":"2026-02-23T06:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.739301 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.739438 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.739483 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.839630 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: E0223 06:41:54.939876 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:54 crc kubenswrapper[4626]: I0223 06:41:54.978473 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 15:28:45.663653626 +0000 UTC Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.040956 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.116571 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.116716 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.117534 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.117577 4626 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.117589 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.141201 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.241273 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.341954 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.442263 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.543045 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.643144 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.743638 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.843953 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: E0223 06:41:55.944515 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.979071 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 03:22:35.415514497 +0000 UTC Feb 23 
06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.981436 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.982234 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.982259 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.982268 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:55 crc kubenswrapper[4626]: I0223 06:41:55.982867 4626 scope.go:117] "RemoveContainer" containerID="54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.045111 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.145375 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: I0223 06:41:56.213127 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 23 06:41:56 crc kubenswrapper[4626]: I0223 06:41:56.215166 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c"} Feb 23 06:41:56 crc kubenswrapper[4626]: I0223 06:41:56.215334 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:56 crc kubenswrapper[4626]: I0223 06:41:56.216132 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:56 crc kubenswrapper[4626]: I0223 06:41:56.216177 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:56 crc kubenswrapper[4626]: I0223 06:41:56.216221 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.246302 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.346842 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.447737 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.548046 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.648548 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.749460 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.850182 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: E0223 06:41:56.951180 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:56 crc kubenswrapper[4626]: I0223 06:41:56.979746 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, 
rotation deadline is 2026-01-10 04:54:00.905246895 +0000 UTC Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.052201 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.153295 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.254424 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.355504 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: I0223 06:41:57.393482 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:41:57 crc kubenswrapper[4626]: I0223 06:41:57.393691 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:57 crc kubenswrapper[4626]: I0223 06:41:57.394949 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:57 crc kubenswrapper[4626]: I0223 06:41:57.395005 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:57 crc kubenswrapper[4626]: I0223 06:41:57.395017 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:57 crc kubenswrapper[4626]: I0223 06:41:57.398817 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.456398 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.556900 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.657418 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.758085 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.858895 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: E0223 06:41:57.959752 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:57 crc kubenswrapper[4626]: I0223 06:41:57.980100 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 01:03:30.400911228 +0000 UTC Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.053652 4626 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.059855 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.160338 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.220875 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.221379 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.224298 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c" exitCode=255 Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.224401 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c"} Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.224525 4626 scope.go:117] "RemoveContainer" containerID="54905aed939731aff6e835bc792447d489562abd2bb466de1417ba670698b1c7" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.224642 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.224930 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.228277 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.228329 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.228343 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.228682 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.228763 4626 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.228780 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.230438 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.230727 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.261174 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.361781 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.462039 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.562578 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.663029 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.763616 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.864059 4626 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: E0223 06:41:58.965033 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:58 crc kubenswrapper[4626]: I0223 06:41:58.980559 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 08:26:57.038517012 +0000 UTC Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.065555 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.166262 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: I0223 06:41:59.228998 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.266561 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.367052 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.467943 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.568546 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.669003 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 
06:41:59.769864 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.870412 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: E0223 06:41:59.971533 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:41:59 crc kubenswrapper[4626]: I0223 06:41:59.980696 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 21:29:40.703857555 +0000 UTC Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.072150 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: I0223 06:42:00.144944 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:42:00 crc kubenswrapper[4626]: I0223 06:42:00.145402 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:42:00 crc kubenswrapper[4626]: I0223 06:42:00.146751 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:00 crc kubenswrapper[4626]: I0223 06:42:00.146798 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:00 crc kubenswrapper[4626]: I0223 06:42:00.146815 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:00 crc kubenswrapper[4626]: I0223 06:42:00.147551 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.147791 4626 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.172368 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.273457 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.374358 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.475168 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.575424 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.675522 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.775646 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.876215 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: E0223 06:42:00.976938 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:00 crc kubenswrapper[4626]: I0223 06:42:00.981097 4626 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:17:38.74201739 +0000 UTC Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.076626 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.077244 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.089137 4626 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.104141 4626 csr.go:261] certificate signing request csr-mcr5s is approved, waiting to be issued Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.112338 4626 csr.go:257] certificate signing request csr-mcr5s is issued Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.178148 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.278629 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.378823 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.479608 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.493869 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.494169 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.495465 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.495529 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.495541 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.496132 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.496319 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.580561 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.680820 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.781862 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.882767 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:01 crc kubenswrapper[4626]: I0223 06:42:01.981677 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 03:12:22.684988474 +0000 UTC Feb 23 06:42:01 crc kubenswrapper[4626]: E0223 06:42:01.983827 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.084361 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: I0223 06:42:02.113912 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-23 06:37:01 +0000 UTC, rotation deadline is 2026-11-15 13:02:21.929048465 +0000 UTC Feb 23 06:42:02 crc kubenswrapper[4626]: I0223 06:42:02.113941 4626 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6366h20m19.815110415s for next certificate rotation Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.185366 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.285662 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.386220 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.487300 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.588006 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.688925 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.789525 4626 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.889914 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:02 crc kubenswrapper[4626]: I0223 06:42:02.982354 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 21:34:53.641084568 +0000 UTC Feb 23 06:42:02 crc kubenswrapper[4626]: E0223 06:42:02.990414 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.091257 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.191329 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.292150 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.392313 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.492621 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.593149 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.693725 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.794179 4626 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.894833 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:03 crc kubenswrapper[4626]: I0223 06:42:03.983209 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 04:59:44.620019024 +0000 UTC Feb 23 06:42:03 crc kubenswrapper[4626]: E0223 06:42:03.995410 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.096183 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.196894 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.297649 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.398537 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.499423 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.600191 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.700465 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.801545 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 
06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.860175 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.865086 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.865128 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.865140 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.865162 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.865174 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:04Z","lastTransitionTime":"2026-02-23T06:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.874205 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.879810 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.879842 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.879851 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.879866 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.879877 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:04Z","lastTransitionTime":"2026-02-23T06:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.888432 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.894249 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.894295 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.894309 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.894330 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.894342 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:04Z","lastTransitionTime":"2026-02-23T06:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.903128 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.907997 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.908038 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.908051 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.908071 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.908087 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:04Z","lastTransitionTime":"2026-02-23T06:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.916088 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.916218 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:42:04 crc kubenswrapper[4626]: E0223 06:42:04.916253 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:04 crc kubenswrapper[4626]: I0223 06:42:04.983855 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 15:53:07.504857167 +0000 UTC Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.017271 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.117652 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: I0223 06:42:05.119259 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:42:05 crc kubenswrapper[4626]: I0223 06:42:05.119410 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:42:05 crc kubenswrapper[4626]: I0223 06:42:05.120156 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:05 crc kubenswrapper[4626]: I0223 06:42:05.120208 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:05 crc kubenswrapper[4626]: I0223 06:42:05.120241 4626 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.218158 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.319085 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.419562 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.520420 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.620980 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.721735 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.822317 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: E0223 06:42:05.923336 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:05 crc kubenswrapper[4626]: I0223 06:42:05.984416 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:15:54.997768177 +0000 UTC Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.023609 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.124425 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 
06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.225426 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.325875 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.426703 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.527578 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.628221 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.729310 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.829986 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: E0223 06:42:06.931049 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:06 crc kubenswrapper[4626]: I0223 06:42:06.985276 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 03:51:23.705846494 +0000 UTC Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.031486 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.132122 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 
06:42:07.232679 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.333992 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.434420 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.535231 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.636257 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.736925 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.837064 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: I0223 06:42:07.872046 4626 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 23 06:42:07 crc kubenswrapper[4626]: E0223 06:42:07.937714 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:07 crc kubenswrapper[4626]: I0223 06:42:07.985702 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 20:56:51.708894431 +0000 UTC Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.037998 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.054552 4626 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.139029 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.239438 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.340096 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.440549 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.541342 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.641964 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.742768 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.843823 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: E0223 06:42:08.944405 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:08 crc kubenswrapper[4626]: I0223 06:42:08.986878 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 03:22:39.559413132 +0000 UTC Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.045289 4626 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.146162 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.246824 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.347381 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.447529 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.548345 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.649423 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.750344 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.851286 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: E0223 06:42:09.951816 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:09 crc kubenswrapper[4626]: I0223 06:42:09.987921 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 21:07:28.392835394 +0000 UTC Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.052528 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.153581 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.254145 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.354513 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.455103 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.556749 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.657808 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.758612 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.859369 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: E0223 06:42:10.960486 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:10 crc kubenswrapper[4626]: I0223 06:42:10.988395 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 14:38:26.210807129 +0000 UTC Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.061000 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc 
kubenswrapper[4626]: E0223 06:42:11.162121 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.262473 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.363401 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.464530 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.565180 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.665906 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.766237 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.867166 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: E0223 06:42:11.967610 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:11 crc kubenswrapper[4626]: I0223 06:42:11.989372 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 07:29:24.417102083 +0000 UTC Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.073213 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.173712 4626 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.274585 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.375588 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.475853 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.576563 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.677615 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.778364 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.878559 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.979112 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:12 crc kubenswrapper[4626]: I0223 06:42:12.981601 4626 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:42:12 crc kubenswrapper[4626]: I0223 06:42:12.982688 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:12 crc kubenswrapper[4626]: I0223 06:42:12.982723 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:12 crc 
kubenswrapper[4626]: I0223 06:42:12.982735 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:12 crc kubenswrapper[4626]: I0223 06:42:12.983237 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c" Feb 23 06:42:12 crc kubenswrapper[4626]: E0223 06:42:12.983430 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:42:12 crc kubenswrapper[4626]: I0223 06:42:12.990223 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:14:36.67609354 +0000 UTC Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.079409 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.180343 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.280874 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.380931 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.481473 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.581541 4626 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.682273 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.782889 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.883900 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: E0223 06:42:13.984279 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:13 crc kubenswrapper[4626]: I0223 06:42:13.990591 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 16:05:31.045616802 +0000 UTC Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.084552 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.185424 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.286378 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.386702 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.487054 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.587670 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.687757 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.788422 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.889106 4626 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.908651 4626 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.977138 4626 apiserver.go:52] "Watching apiserver" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.980317 4626 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.980739 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.981254 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.981304 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.981348 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.981441 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.981577 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.981838 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.982116 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.982269 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.983373 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.983877 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.983916 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.984134 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.984369 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.984658 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.985066 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.985191 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 06:42:14 crc kubenswrapper[4626]: E0223 06:42:14.985616 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.986922 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.990662 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 06:53:39.052438661 +0000 UTC Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.991516 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.991559 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.991572 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.991591 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.991600 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:14Z","lastTransitionTime":"2026-02-23T06:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:14 crc kubenswrapper[4626]: I0223 06:42:14.991797 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.006780 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.015540 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.025311 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.034098 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.039436 4626 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.041312 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.048894 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.055526 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.062207 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.089535 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.089638 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.089726 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.089812 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.089891 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.089946 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090015 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090080 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090198 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090293 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090721 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090854 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091424 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091459 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091484 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091531 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091552 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091573 4626 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091593 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091613 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091632 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091650 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091667 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091697 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091713 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091732 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091751 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091770 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091790 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091811 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091830 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091850 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091873 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091895 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091919 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091938 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091956 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091975 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091992 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092037 4626 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092055 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092071 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092103 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092124 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092143 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092158 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092181 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092198 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092230 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092251 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092268 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092301 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092321 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092337 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092354 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092370 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090117 
4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090219 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090441 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090634 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090619 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090664 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090783 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.090912 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091348 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.091994 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092156 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092381 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092655 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092697 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092702 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092706 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092386 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092887 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092909 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092918 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092928 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092951 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092978 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.092998 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093018 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093035 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093058 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093059 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093076 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093078 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093088 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093113 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093105 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093101 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093194 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093198 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093267 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093277 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093280 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093299 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093326 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093350 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093371 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093390 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093409 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093429 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093442 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093451 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093444 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093473 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093520 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093546 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093572 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093592 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093616 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093638 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093655 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093659 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093674 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093708 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093712 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093728 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093750 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093758 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093768 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093812 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093838 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093872 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093895 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093897 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093920 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093946 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093975 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.093997 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094030 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094051 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094071 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094070 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094094 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094116 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094161 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094178 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094183 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094232 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094255 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094277 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094301 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094322 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094341 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094366 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094388 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094410 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094430 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc 
kubenswrapper[4626]: I0223 06:42:15.094452 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094474 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094511 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094533 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094551 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094570 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094588 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094606 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094624 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094645 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094667 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 
06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094694 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094713 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094732 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094752 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094940 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094962 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095252 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095276 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095294 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095311 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095330 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095351 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095370 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095640 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095662 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095680 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095732 4626 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095752 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096078 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096095 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096118 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096137 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" 
(UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096158 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096179 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096200 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096225 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096245 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096263 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096286 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096305 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096324 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096344 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096364 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096384 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096409 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096431 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096452 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096471 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096603 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096625 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096649 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096669 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096698 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096717 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096738 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096759 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096786 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096804 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096844 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096864 4626 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096884 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096903 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096921 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.096942 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097152 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097170 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097187 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097206 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097226 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097245 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097263 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097282 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097309 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097327 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097347 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097364 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097383 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097401 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097421 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097440 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097476 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:15 
crc kubenswrapper[4626]: I0223 06:42:15.097526 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097553 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097583 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097602 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097622 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097649 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097668 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097695 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097715 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097737 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097756 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097776 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097793 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097833 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097844 4626 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 23 
06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097854 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097864 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097874 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097884 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097894 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097903 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097912 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097923 4626 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097932 4626 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097941 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097971 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097981 4626 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097991 4626 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098001 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098010 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc 
kubenswrapper[4626]: I0223 06:42:15.098020 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098029 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098042 4626 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098053 4626 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098062 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098071 4626 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098080 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098089 4626 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098098 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098106 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098118 4626 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098129 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098138 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098147 4626 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098157 4626 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098166 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098176 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098186 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098524 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094341 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.094629 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095192 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095221 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.095426 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097205 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097219 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097289 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097296 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097658 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097840 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097860 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.097995 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098030 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098038 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098057 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098108 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098385 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098398 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098439 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.098439 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.099681 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.099743 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.104222 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100450 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100526 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100727 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100746 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100827 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100907 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100952 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.100996 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.101355 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.101449 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.101546 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.101668 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.101927 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.101935 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.101982 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.102360 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.102442 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.102521 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.103072 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.103288 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.103372 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.104336 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.103637 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.103841 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.103923 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.104575 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.104757 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.104799 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.104907 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.104943 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105149 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105196 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105214 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105457 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105543 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105590 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105741 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105791 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.105861 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.106143 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.106265 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.106304 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.106599 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.106955 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.107011 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.107035 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.107048 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.107067 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.107076 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.107374 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.107914 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.108233 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.108639 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:42:15.608615993 +0000 UTC m=+87.947945259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.112783 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.112842 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.112762 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.114087 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.114396 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.114470 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.114485 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.114911 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.115550 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.115618 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.115840 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.116112 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.102213 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.116234 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.116485 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.116625 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.116918 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.116942 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.117256 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.117367 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.117447 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.117556 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.117951 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.117953 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.117999 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.118059 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.118194 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.118221 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.118369 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.118398 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.118434 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119163 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119293 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119278 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119589 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119713 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119840 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119879 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119982 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120023 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.119911 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120025 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120150 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120225 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120191 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120276 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120300 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120459 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120546 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120727 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120794 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120804 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.120812 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.121083 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.121101 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.121209 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.121485 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.121513 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.121870 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122152 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122275 4626 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122558 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122601 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.122629 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122795 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122847 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122919 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.122948 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.123112 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.123321 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.123572 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.124933 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.124955 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:15.624937541 +0000 UTC m=+87.964266796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.125031 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:15.625011811 +0000 UTC m=+87.964341077 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.125055 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.125611 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.125620 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.125675 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.125762 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.124191 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.125636 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.126080 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.126280 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.126750 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.126745 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.126831 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.127046 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.127099 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.127646 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.141150 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.141262 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.141420 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.141643 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.141762 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.141822 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.142219 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.142345 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod 
"c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.142393 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.142752 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.143834 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.143856 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.143886 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.143934 4626 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:15.643920822 +0000 UTC m=+87.983250087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.143998 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.144535 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.144557 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.144567 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.144619 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:15.644610194 +0000 UTC m=+87.983939460 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.144625 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.146844 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.149206 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.149253 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.149274 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.149282 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.149297 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.149307 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.156131 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.157485 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.158846 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.158869 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.158878 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.158890 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.158901 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.165062 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.166559 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.168003 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.168026 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.168035 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.168048 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.168057 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.169785 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.174325 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.176767 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.176859 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.176939 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.177009 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.177064 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.183329 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.183429 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.198696 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.198798 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.198769 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.198999 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199073 4626 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199135 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199202 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199260 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199318 4626 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199384 4626 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199440 4626 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199514 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199579 4626 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199648 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199710 4626 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199788 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199845 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199899 4626 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199955 4626 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc 
kubenswrapper[4626]: I0223 06:42:15.200018 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200068 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200126 4626 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200183 4626 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200241 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200291 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200348 4626 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200408 4626 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200454 4626 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200534 4626 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200600 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200647 4626 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200748 4626 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200810 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200864 4626 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" 
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200913 4626 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.200971 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201020 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201074 4626 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201134 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201191 4626 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.199028 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201257 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201303 4626 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201316 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201328 4626 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201340 4626 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201349 4626 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201357 4626 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 
06:42:15.201366 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201375 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201384 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201393 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201404 4626 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201416 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201428 4626 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201439 4626 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201449 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201457 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201466 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201474 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201483 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201512 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201523 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201532 4626 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201540 4626 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201549 4626 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201557 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201566 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201575 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201583 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201591 
4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201600 4626 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201609 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201616 4626 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201624 4626 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201633 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201642 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201650 4626 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201659 4626 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201668 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201677 4626 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201695 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201703 4626 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201711 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201719 4626 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node 
\"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201728 4626 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201737 4626 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201747 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201756 4626 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201766 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201775 4626 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201784 4626 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201794 
4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201805 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201815 4626 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201825 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201835 4626 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201845 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201854 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201862 4626 
reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201870 4626 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201878 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201888 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201897 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201905 4626 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201914 4626 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201922 4626 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201931 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201938 4626 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201948 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201969 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201978 4626 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201986 4626 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.201995 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202002 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202011 4626 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202019 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202027 4626 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202036 4626 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202044 4626 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202052 4626 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202061 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202070 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202079 4626 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202087 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202096 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202104 4626 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202113 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202121 4626 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202129 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202137 4626 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202145 4626 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202153 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202160 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202169 4626 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202177 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202185 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202194 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202202 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202210 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202219 4626 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202234 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202243 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202252 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202263 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202271 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202279 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202287 4626 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202294 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" 
DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202302 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202310 4626 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202318 4626 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202326 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202334 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202342 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202350 4626 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202357 4626 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202367 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202377 4626 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202393 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202402 4626 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202411 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.202419 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.208881 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.208902 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.208909 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.208920 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.208928 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.296457 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.302142 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.307004 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.310443 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.310459 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.310468 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.310479 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.310489 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:15 crc kubenswrapper[4626]: W0223 06:42:15.313275 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-47681f1c30c97b14f5f174d7da8d3509ffcc5326de44c432227166e944ec03f8 WatchSource:0}: Error finding container 47681f1c30c97b14f5f174d7da8d3509ffcc5326de44c432227166e944ec03f8: Status 404 returned error can't find the container with id 47681f1c30c97b14f5f174d7da8d3509ffcc5326de44c432227166e944ec03f8 Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.313297 4626 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:42:15 crc kubenswrapper[4626]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 23 06:42:15 crc kubenswrapper[4626]: set -o allexport Feb 23 06:42:15 crc kubenswrapper[4626]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 23 06:42:15 crc kubenswrapper[4626]: source /etc/kubernetes/apiserver-url.env Feb 23 06:42:15 crc kubenswrapper[4626]: else Feb 23 06:42:15 crc kubenswrapper[4626]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 23 06:42:15 crc kubenswrapper[4626]: exit 1 Feb 23 06:42:15 crc kubenswrapper[4626]: fi Feb 23 06:42:15 crc kubenswrapper[4626]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 23 06:42:15 crc kubenswrapper[4626]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:42:15 crc kubenswrapper[4626]: > logger="UnhandledError" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.314389 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.315461 4626 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:42:15 crc kubenswrapper[4626]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 23 06:42:15 crc kubenswrapper[4626]: if [[ -f "/env/_master" ]]; then Feb 23 06:42:15 crc kubenswrapper[4626]: set -o allexport Feb 23 06:42:15 crc kubenswrapper[4626]: source "/env/_master" Feb 23 06:42:15 crc kubenswrapper[4626]: set +o allexport Feb 23 06:42:15 crc kubenswrapper[4626]: fi Feb 23 06:42:15 crc kubenswrapper[4626]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Feb 23 06:42:15 crc kubenswrapper[4626]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 23 06:42:15 crc kubenswrapper[4626]: ho_enable="--enable-hybrid-overlay" Feb 23 06:42:15 crc kubenswrapper[4626]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 23 06:42:15 crc kubenswrapper[4626]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 23 06:42:15 crc kubenswrapper[4626]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 23 06:42:15 crc kubenswrapper[4626]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 23 06:42:15 crc kubenswrapper[4626]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 23 06:42:15 crc kubenswrapper[4626]: --webhook-host=127.0.0.1 \ Feb 23 06:42:15 crc kubenswrapper[4626]: --webhook-port=9743 \ Feb 23 06:42:15 crc kubenswrapper[4626]: ${ho_enable} \ Feb 23 06:42:15 crc kubenswrapper[4626]: --enable-interconnect \ Feb 23 06:42:15 crc kubenswrapper[4626]: --disable-approver \ Feb 23 06:42:15 crc kubenswrapper[4626]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 23 06:42:15 crc kubenswrapper[4626]: --wait-for-kubernetes-api=200s \ Feb 23 06:42:15 crc kubenswrapper[4626]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 23 06:42:15 crc kubenswrapper[4626]: --loglevel="${LOGLEVEL}" Feb 23 06:42:15 crc kubenswrapper[4626]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:42:15 crc kubenswrapper[4626]: > logger="UnhandledError" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.318216 4626 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:42:15 crc kubenswrapper[4626]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 23 06:42:15 crc kubenswrapper[4626]: if [[ -f "/env/_master" ]]; then Feb 23 06:42:15 crc kubenswrapper[4626]: set -o allexport Feb 23 06:42:15 crc kubenswrapper[4626]: source "/env/_master" Feb 23 06:42:15 crc kubenswrapper[4626]: set +o allexport Feb 23 06:42:15 crc kubenswrapper[4626]: fi Feb 23 06:42:15 crc kubenswrapper[4626]: Feb 23 06:42:15 crc kubenswrapper[4626]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 23 06:42:15 crc kubenswrapper[4626]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 23 06:42:15 crc kubenswrapper[4626]: --disable-webhook \ Feb 23 06:42:15 crc kubenswrapper[4626]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 23 06:42:15 crc kubenswrapper[4626]: --loglevel="${LOGLEVEL}" Feb 23 06:42:15 crc kubenswrapper[4626]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:42:15 crc kubenswrapper[4626]: > logger="UnhandledError" Feb 23 06:42:15 crc kubenswrapper[4626]: W0223 06:42:15.318789 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f21f8ea22a0861fdfdfebcbe100346550d9513b70b4c4f2b8a56adc07fa89725 WatchSource:0}: Error finding container f21f8ea22a0861fdfdfebcbe100346550d9513b70b4c4f2b8a56adc07fa89725: Status 404 returned error can't find the container with id 
f21f8ea22a0861fdfdfebcbe100346550d9513b70b4c4f2b8a56adc07fa89725 Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.319916 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.321605 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.322730 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.413090 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.413113 
4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.413122 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.413158 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.413169 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.514838 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.514865 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.514875 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.514906 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.514917 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.616594 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.616639 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.616650 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.616667 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.616681 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.705976 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.706027 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.706070 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.706095 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706142 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:42:16.706100698 +0000 UTC m=+89.045429974 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.706193 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706212 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706326 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706352 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:16.706341033 +0000 UTC m=+89.045670309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706375 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:16.706363655 +0000 UTC m=+89.045692921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706417 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706440 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706452 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706554 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:16.706491046 +0000 UTC m=+89.045820312 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706753 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706843 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.706971 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:42:15 crc kubenswrapper[4626]: E0223 06:42:15.707088 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:16.707071072 +0000 UTC m=+89.046400339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.719605 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.719639 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.719652 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.719666 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.719678 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.821862 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.821972 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.822033 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.822108 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.822181 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.924240 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.924361 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.924479 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.924577 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.924652 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:15Z","lastTransitionTime":"2026-02-23T06:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.985611 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.986141 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.987433 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.988065 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.989057 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.989601 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.990171 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.991107 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.991325 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 22:50:28.292185192 +0000 UTC
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.991711 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.992672 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.993144 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.994234 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.994709 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.995185 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.996080 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.996644 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.997516 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.997987 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.998490 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 23 06:42:15 crc kubenswrapper[4626]: I0223 06:42:15.999518 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:15.999979 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.001007 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.001514 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.002479 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.002932 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.003557 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.004792 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.005295 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.006273 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.006773 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.007589 4626 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.007679 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.009234 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.010070 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.010464 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.011817 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.012383 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.013273 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.013850 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.014756 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.015155 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.016059 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.016966 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.017924 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.018357 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.019200 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.019676 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.021016 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.021454 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.022243 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.022701 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.023529 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.024053 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.024489 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.028994 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.029050 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.029062 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.029080 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.029095 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.131140 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.131195 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.131208 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.131219 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.131227 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.233359 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.233402 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.233419 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.233435 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.233449 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.271257 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f21f8ea22a0861fdfdfebcbe100346550d9513b70b4c4f2b8a56adc07fa89725"}
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.272492 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"47681f1c30c97b14f5f174d7da8d3509ffcc5326de44c432227166e944ec03f8"}
Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.272865 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.273934 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.273943 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"52dafc4c7fc777a640c4bcc7104e3dcd56c6f5f412a8227366c0903ff10f4c36"}
Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.274446 4626 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 23 06:42:16 crc kubenswrapper[4626]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Feb 23 06:42:16 crc kubenswrapper[4626]: if [[ -f "/env/_master" ]]; then
Feb 23 06:42:16 crc kubenswrapper[4626]: set -o allexport
Feb 23 06:42:16 crc kubenswrapper[4626]: source "/env/_master"
Feb 23 06:42:16 crc kubenswrapper[4626]: set +o allexport
Feb 23 06:42:16 crc kubenswrapper[4626]: fi
Feb 23 06:42:16 crc kubenswrapper[4626]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Feb 23 06:42:16 crc kubenswrapper[4626]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Feb 23 06:42:16 crc kubenswrapper[4626]: ho_enable="--enable-hybrid-overlay"
Feb 23 06:42:16 crc kubenswrapper[4626]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Feb 23 06:42:16 crc kubenswrapper[4626]: # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Feb 23 06:42:16 crc kubenswrapper[4626]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Feb 23 06:42:16 crc kubenswrapper[4626]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Feb 23 06:42:16 crc kubenswrapper[4626]: --webhook-cert-dir="/etc/webhook-cert" \
Feb 23 06:42:16 crc kubenswrapper[4626]: --webhook-host=127.0.0.1 \
Feb 23 06:42:16 crc kubenswrapper[4626]: --webhook-port=9743 \
Feb 23 06:42:16 crc kubenswrapper[4626]: ${ho_enable} \
Feb 23 06:42:16 crc kubenswrapper[4626]: --enable-interconnect \
Feb 23 06:42:16 crc kubenswrapper[4626]: --disable-approver \
Feb 23 06:42:16 crc kubenswrapper[4626]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Feb 23 06:42:16 crc kubenswrapper[4626]: --wait-for-kubernetes-api=200s \
Feb 23 06:42:16 crc kubenswrapper[4626]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Feb 23 06:42:16 crc kubenswrapper[4626]: --loglevel="${LOGLEVEL}"
Feb 23 06:42:16 crc kubenswrapper[4626]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMoun
t:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:42:16 crc kubenswrapper[4626]: > logger="UnhandledError" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.275542 4626 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:42:16 crc kubenswrapper[4626]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 23 06:42:16 crc kubenswrapper[4626]: set -o allexport Feb 23 06:42:16 crc kubenswrapper[4626]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 23 06:42:16 crc kubenswrapper[4626]: source /etc/kubernetes/apiserver-url.env Feb 23 06:42:16 crc kubenswrapper[4626]: else Feb 23 06:42:16 crc kubenswrapper[4626]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 23 06:42:16 crc kubenswrapper[4626]: exit 1 Feb 23 06:42:16 crc kubenswrapper[4626]: fi Feb 23 06:42:16 crc kubenswrapper[4626]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 23 06:42:16 crc kubenswrapper[4626]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:42:16 crc kubenswrapper[4626]: > logger="UnhandledError" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.276522 4626 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:42:16 crc kubenswrapper[4626]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 23 06:42:16 crc kubenswrapper[4626]: if [[ -f "/env/_master" ]]; then Feb 23 06:42:16 crc kubenswrapper[4626]: set -o allexport Feb 23 06:42:16 crc kubenswrapper[4626]: source "/env/_master" Feb 23 06:42:16 crc kubenswrapper[4626]: set +o allexport Feb 23 06:42:16 crc 
kubenswrapper[4626]: fi Feb 23 06:42:16 crc kubenswrapper[4626]: Feb 23 06:42:16 crc kubenswrapper[4626]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 23 06:42:16 crc kubenswrapper[4626]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 23 06:42:16 crc kubenswrapper[4626]: --disable-webhook \ Feb 23 06:42:16 crc kubenswrapper[4626]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 23 06:42:16 crc kubenswrapper[4626]: --loglevel="${LOGLEVEL}" Feb 23 06:42:16 crc kubenswrapper[4626]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Conta
inerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:42:16 crc kubenswrapper[4626]: > logger="UnhandledError" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.277522 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.277593 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.283517 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.292940 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.300431 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.306651 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.314845 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.322228 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.329657 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.335633 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.335706 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.335721 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.335742 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.335758 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.337370 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.344364 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.349964 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.357422 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.364516 4626 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.371296 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.378742 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.437563 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.437664 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.437746 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.437809 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.437865 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.540026 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.540076 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.540089 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.540110 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.540124 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.642248 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.642276 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.642286 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.642297 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.642308 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.713908 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.713951 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.714061 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714043 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.714086 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 
06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.714103 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714106 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714125 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714165 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714176 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:18.714149739 +0000 UTC m=+91.053478995 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714181 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714208 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:18.714193833 +0000 UTC m=+91.053523099 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714227 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:18.714215734 +0000 UTC m=+91.053545010 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714298 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714334 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714352 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714431 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:18.714405834 +0000 UTC m=+91.053735099 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.714543 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:42:18.714531561 +0000 UTC m=+91.053860828 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.743710 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.743748 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.743759 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.743775 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.743788 4626 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.845693 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.845768 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.845784 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.845802 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.845817 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.947172 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.947203 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.947213 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.947227 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.947235 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:16Z","lastTransitionTime":"2026-02-23T06:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.981730 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.981754 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.981737 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.981839 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.981900 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:16 crc kubenswrapper[4626]: E0223 06:42:16.981950 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:16 crc kubenswrapper[4626]: I0223 06:42:16.991462 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:05:24.355943342 +0000 UTC Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.049213 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.049263 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.049275 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.049290 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.049300 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.074697 4626 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.151072 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.151191 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.151252 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.151331 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.151402 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.253841 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.253962 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.254026 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.254078 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.254139 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.356297 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.356335 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.356346 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.356363 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.356373 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.459130 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.459181 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.459196 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.459213 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.459233 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.562091 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.562200 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.562258 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.562314 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.562370 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.664521 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.664557 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.664567 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.664584 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.664596 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.766113 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.766143 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.766153 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.766167 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.766179 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.868212 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.868255 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.868267 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.868283 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.868292 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.970484 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.970535 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.970544 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.970563 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.970573 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:17Z","lastTransitionTime":"2026-02-23T06:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.991973 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 01:46:33.600867856 +0000 UTC Feb 23 06:42:17 crc kubenswrapper[4626]: I0223 06:42:17.995555 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.003695 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.010799 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.019360 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.027979 4626 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.036832 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.044878 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.072047 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.072081 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.072091 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.072108 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.072118 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.173914 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.173950 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.173963 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.173978 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.173991 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.275540 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.275588 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.275601 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.275619 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.275634 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.377143 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.377179 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.377189 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.377203 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.377214 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.479546 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.479572 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.479583 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.479594 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.479604 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.581214 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.581242 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.581252 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.581264 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.581273 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.682699 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.682742 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.682751 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.682766 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.682776 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.729148 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.729266 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.729304 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.729333 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.729355 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729434 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729459 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729473 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729529 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729570 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729541 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729638 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729655 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729552 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:22.729534166 +0000 UTC m=+95.068863442 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729730 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:22.729696312 +0000 UTC m=+95.069025588 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729748 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:22.729741107 +0000 UTC m=+95.069070373 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729760 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:22.729754643 +0000 UTC m=+95.069083919 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.729804 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:42:22.729788577 +0000 UTC m=+95.069117843 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.784650 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.784673 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.784682 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.784693 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.784702 4626 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.886803 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.886927 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.886988 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.887047 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.887100 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.981793 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.981898 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.981938 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.982052 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.982138 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:18 crc kubenswrapper[4626]: E0223 06:42:18.982202 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.988944 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.988990 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.989001 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.989012 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.989023 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:18Z","lastTransitionTime":"2026-02-23T06:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:18 crc kubenswrapper[4626]: I0223 06:42:18.992405 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 03:00:23.829481814 +0000 UTC Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.090958 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.090996 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.091006 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.091017 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.091026 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.193189 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.193223 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.193232 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.193242 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.193251 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.295367 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.295430 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.295442 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.295465 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.295478 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.397808 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.397920 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.398011 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.398098 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.398182 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.500004 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.500132 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.500192 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.500268 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.500335 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.602076 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.602108 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.602119 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.602133 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.602143 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.704834 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.704882 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.704895 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.704915 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.704928 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.808238 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.808277 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.808287 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.808304 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.808317 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.911146 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.911181 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.911191 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.911205 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.911214 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:19Z","lastTransitionTime":"2026-02-23T06:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:19 crc kubenswrapper[4626]: I0223 06:42:19.993467 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 08:17:36.041473001 +0000 UTC Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.013183 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.013352 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.013407 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.013465 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.013544 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.116235 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.116435 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.116519 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.116586 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.116649 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.219161 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.219201 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.219211 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.219227 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.219236 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.320767 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.320792 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.320802 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.320813 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.320821 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.422489 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.422531 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.422540 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.422552 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.422560 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.524623 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.524651 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.524661 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.524673 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.524685 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.626650 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.626696 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.626706 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.626717 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.626738 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.728846 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.728873 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.728881 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.728892 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.728901 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.831239 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.831274 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.831283 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.831297 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.831305 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.932988 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.933035 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.933046 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.933058 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.933066 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:20Z","lastTransitionTime":"2026-02-23T06:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.980972 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.980983 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:20 crc kubenswrapper[4626]: E0223 06:42:20.981091 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:20 crc kubenswrapper[4626]: E0223 06:42:20.981236 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.981394 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:20 crc kubenswrapper[4626]: E0223 06:42:20.981617 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:20 crc kubenswrapper[4626]: I0223 06:42:20.994530 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 11:31:51.609777123 +0000 UTC Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.034763 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.034795 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.034804 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.034815 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.034822 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.135902 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.135929 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.135955 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.135968 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.135976 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.238027 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.238053 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.238064 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.238076 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.238085 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.340089 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.340140 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.340153 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.340168 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.340176 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.442000 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.442025 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.442035 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.442047 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.442054 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.543409 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.543442 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.543470 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.543483 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.543492 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.644820 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.644857 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.644870 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.644884 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.644895 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.747609 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.747648 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.747657 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.747672 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.747681 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.850411 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.850460 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.850472 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.850494 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.850531 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.953118 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.953157 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.953167 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.953183 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.953196 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:21Z","lastTransitionTime":"2026-02-23T06:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:21 crc kubenswrapper[4626]: I0223 06:42:21.994987 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 17:15:02.443162755 +0000 UTC Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.054634 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.054680 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.054693 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.054709 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.054721 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.157045 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.157093 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.157104 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.157122 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.157134 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.259258 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.259301 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.259311 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.259329 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.259341 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.361123 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.361165 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.361176 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.361193 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.361202 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.463914 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.463952 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.463964 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.463979 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.463988 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.566233 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.566277 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.566289 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.566306 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.566320 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.668520 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.668566 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.668580 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.668598 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.668612 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.766132 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.766200 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.766234 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.766255 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.766277 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766434 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766457 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766530 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766549 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766556 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766574 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766463 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766615 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766558 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:30.766537002 +0000 UTC m=+103.105866268 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766649 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:30.766633915 +0000 UTC m=+103.105963191 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766684 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:42:30.766659604 +0000 UTC m=+103.105988880 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766728 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:30.766719357 +0000 UTC m=+103.106048633 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.766766 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:30.766757118 +0000 UTC m=+103.106086394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.770222 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.770263 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.770275 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.770295 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.770308 4626 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.872464 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.872514 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.872528 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.872543 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.872554 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.974235 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.974268 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.974280 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.974290 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.974297 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:22Z","lastTransitionTime":"2026-02-23T06:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.981677 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.981707 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.981782 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.981792 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.981892 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:22 crc kubenswrapper[4626]: E0223 06:42:22.981991 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:22 crc kubenswrapper[4626]: I0223 06:42:22.996138 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 23:41:05.346702459 +0000 UTC Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.075779 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.075942 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.076003 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.076076 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.076142 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.177823 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.177955 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.178032 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.178098 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.178150 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.280513 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.280549 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.280560 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.280575 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.280584 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.382693 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.382725 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.382735 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.382773 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.382782 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.484317 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.484451 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.484549 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.484630 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.484694 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.586042 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.586076 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.586087 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.586122 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.586134 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.687975 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.688001 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.688010 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.688022 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.688031 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.789347 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.789381 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.789389 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.789399 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.789408 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.891452 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.891483 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.891532 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.891544 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.891553 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.993374 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.993413 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.993424 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.993437 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.993445 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:23Z","lastTransitionTime":"2026-02-23T06:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:23 crc kubenswrapper[4626]: I0223 06:42:23.996806 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 23:10:39.831880541 +0000 UTC Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.096076 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.096103 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.096112 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.096125 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.096135 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.198128 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.198161 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.198171 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.198185 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.198195 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.299772 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.299824 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.299835 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.299854 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.299866 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.401678 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.401715 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.401726 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.401742 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.401760 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.504020 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.504057 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.504067 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.504082 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.504092 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.606082 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.606135 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.606150 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.606166 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.606180 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.708336 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.708387 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.708403 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.708423 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.708436 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.810478 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.810535 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.810548 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.810564 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.810574 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.912472 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.912537 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.912548 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.912563 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.912573 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:24Z","lastTransitionTime":"2026-02-23T06:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.981220 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.981307 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:24 crc kubenswrapper[4626]: E0223 06:42:24.981353 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.981454 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:24 crc kubenswrapper[4626]: E0223 06:42:24.981640 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:24 crc kubenswrapper[4626]: E0223 06:42:24.981749 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:24 crc kubenswrapper[4626]: I0223 06:42:24.997422 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 03:24:41.983008164 +0000 UTC Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.014240 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.014271 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.014299 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.014316 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.014327 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.017277 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-km45b"] Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.017627 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.019946 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.020085 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.021116 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.028384 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.035435 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.041867 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.049154 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.056012 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.060814 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.066777 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.071620 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.117137 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.117183 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.117193 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.117211 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 
06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.117225 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.185974 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfwjm\" (UniqueName: \"kubernetes.io/projected/939db1b0-d1bf-495e-a842-f0d102e2a420-kube-api-access-tfwjm\") pod \"node-resolver-km45b\" (UID: \"939db1b0-d1bf-495e-a842-f0d102e2a420\") " pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.186122 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/939db1b0-d1bf-495e-a842-f0d102e2a420-hosts-file\") pod \"node-resolver-km45b\" (UID: \"939db1b0-d1bf-495e-a842-f0d102e2a420\") " pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.219439 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.219470 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.219529 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.219542 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc 
kubenswrapper[4626]: I0223 06:42:25.219551 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.268677 4626 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.287350 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfwjm\" (UniqueName: \"kubernetes.io/projected/939db1b0-d1bf-495e-a842-f0d102e2a420-kube-api-access-tfwjm\") pod \"node-resolver-km45b\" (UID: \"939db1b0-d1bf-495e-a842-f0d102e2a420\") " pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.287393 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/939db1b0-d1bf-495e-a842-f0d102e2a420-hosts-file\") pod \"node-resolver-km45b\" (UID: \"939db1b0-d1bf-495e-a842-f0d102e2a420\") " pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.287585 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/939db1b0-d1bf-495e-a842-f0d102e2a420-hosts-file\") pod \"node-resolver-km45b\" (UID: \"939db1b0-d1bf-495e-a842-f0d102e2a420\") " pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.301087 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfwjm\" (UniqueName: 
\"kubernetes.io/projected/939db1b0-d1bf-495e-a842-f0d102e2a420-kube-api-access-tfwjm\") pod \"node-resolver-km45b\" (UID: \"939db1b0-d1bf-495e-a842-f0d102e2a420\") " pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.321264 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.321312 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.321324 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.321342 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.321352 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.329394 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-km45b" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.357564 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-84c69"] Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.359893 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2jvsw"] Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.360233 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lbzx5"] Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.360591 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.360984 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.361810 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.364867 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.365125 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.368633 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.368868 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.368889 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.368644 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.369089 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.369191 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.369257 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.369297 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.369383 4626 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.369473 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.380006 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.388074 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.397795 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.403910 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 
06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.412799 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.420408 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.426159 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.426258 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.426326 4626 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.426403 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.426486 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.428732 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.438256 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.446257 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.453290 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.460814 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.467814 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.474598 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.481375 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488195 4626 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488455 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjprv\" (UniqueName: \"kubernetes.io/projected/27fe907f-67db-4a19-a485-22debfb92983-kube-api-access-bjprv\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488493 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-cnibin\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488530 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-hostroot\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc 
kubenswrapper[4626]: I0223 06:42:25.488562 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-system-cni-dir\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488585 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cnibin\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488606 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4t25\" (UniqueName: \"kubernetes.io/projected/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-kube-api-access-c4t25\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488629 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-k8s-cni-cncf-io\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488654 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-conf-dir\") pod \"multus-lbzx5\" (UID: 
\"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488680 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-os-release\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488699 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cni-binary-copy\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488724 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npl82\" (UniqueName: \"kubernetes.io/projected/1b11f67b-b1fe-456a-843e-471433062d6c-kube-api-access-npl82\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488766 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-netns\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488790 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-cni-multus\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488808 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-os-release\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488832 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1b11f67b-b1fe-456a-843e-471433062d6c-rootfs\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488925 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b11f67b-b1fe-456a-843e-471433062d6c-mcd-auth-proxy-config\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.488964 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-etc-kubernetes\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489008 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-kubelet\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489031 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27fe907f-67db-4a19-a485-22debfb92983-multus-daemon-config\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489053 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-multus-certs\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489072 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489110 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27fe907f-67db-4a19-a485-22debfb92983-cni-binary-copy\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489132 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489152 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b11f67b-b1fe-456a-843e-471433062d6c-proxy-tls\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489176 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-socket-dir-parent\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489197 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-system-cni-dir\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.489214 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-cni-bin\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc 
kubenswrapper[4626]: I0223 06:42:25.489270 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-cni-dir\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.494137 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.499608 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.509803 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.517528 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.525957 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.528357 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.528388 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.528403 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.528421 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.528432 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.529355 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.529392 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.529406 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.529425 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.529436 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: E0223 06:42:25.536572 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.540031 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.540067 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.540077 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.540100 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.540113 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: E0223 06:42:25.547175 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.550010 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.550047 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.550058 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.550073 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.550083 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: E0223 06:42:25.557112 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.559991 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.560018 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.560029 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.560044 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.560054 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: E0223 06:42:25.568145 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.574441 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.574471 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.574481 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.574510 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.574520 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: E0223 06:42:25.581187 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: E0223 06:42:25.581331 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590431 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npl82\" (UniqueName: \"kubernetes.io/projected/1b11f67b-b1fe-456a-843e-471433062d6c-kube-api-access-npl82\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590470 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-cni-multus\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590533 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-os-release\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590570 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-netns\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc 
kubenswrapper[4626]: I0223 06:42:25.590594 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1b11f67b-b1fe-456a-843e-471433062d6c-rootfs\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590620 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b11f67b-b1fe-456a-843e-471433062d6c-mcd-auth-proxy-config\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590621 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-cni-multus\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590681 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-etc-kubernetes\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590716 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1b11f67b-b1fe-456a-843e-471433062d6c-rootfs\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590642 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-etc-kubernetes\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590802 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-netns\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590741 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-os-release\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590823 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-multus-certs\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590877 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-multus-certs\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590924 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.590967 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-kubelet\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591005 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27fe907f-67db-4a19-a485-22debfb92983-multus-daemon-config\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591033 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27fe907f-67db-4a19-a485-22debfb92983-cni-binary-copy\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591060 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591070 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-kubelet\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591091 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b11f67b-b1fe-456a-843e-471433062d6c-proxy-tls\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591128 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-socket-dir-parent\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591155 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-system-cni-dir\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591179 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-cni-bin\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591206 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-cni-dir\") pod \"multus-lbzx5\" (UID: 
\"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591262 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjprv\" (UniqueName: \"kubernetes.io/projected/27fe907f-67db-4a19-a485-22debfb92983-kube-api-access-bjprv\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591289 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-cnibin\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591314 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-hostroot\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591349 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-system-cni-dir\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591373 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cnibin\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc 
kubenswrapper[4626]: I0223 06:42:25.591401 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4t25\" (UniqueName: \"kubernetes.io/projected/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-kube-api-access-c4t25\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591430 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-k8s-cni-cncf-io\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591457 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-conf-dir\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591490 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-os-release\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591539 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cni-binary-copy\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591887 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592031 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27fe907f-67db-4a19-a485-22debfb92983-cni-binary-copy\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592055 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27fe907f-67db-4a19-a485-22debfb92983-multus-daemon-config\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592080 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-cnibin\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592231 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cni-binary-copy\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592267 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-run-k8s-cni-cncf-io\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592290 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-conf-dir\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.591403 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b11f67b-b1fe-456a-843e-471433062d6c-mcd-auth-proxy-config\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592325 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-hostroot\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592340 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-os-release\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592349 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-system-cni-dir\") pod \"multus-additional-cni-plugins-84c69\" (UID: 
\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592371 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-cnibin\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592449 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-system-cni-dir\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592491 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-socket-dir-parent\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592564 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-multus-cni-dir\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.592626 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27fe907f-67db-4a19-a485-22debfb92983-host-var-lib-cni-bin\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 
06:42:25.593018 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.596708 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b11f67b-b1fe-456a-843e-471433062d6c-proxy-tls\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.602831 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npl82\" (UniqueName: \"kubernetes.io/projected/1b11f67b-b1fe-456a-843e-471433062d6c-kube-api-access-npl82\") pod \"machine-config-daemon-2jvsw\" (UID: \"1b11f67b-b1fe-456a-843e-471433062d6c\") " pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.604970 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4t25\" (UniqueName: \"kubernetes.io/projected/2cb6f72e-5acb-4a3b-8956-c8f89d47afe0-kube-api-access-c4t25\") pod \"multus-additional-cni-plugins-84c69\" (UID: \"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\") " pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.605466 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjprv\" (UniqueName: \"kubernetes.io/projected/27fe907f-67db-4a19-a485-22debfb92983-kube-api-access-bjprv\") pod \"multus-lbzx5\" (UID: \"27fe907f-67db-4a19-a485-22debfb92983\") " pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 
06:42:25.630631 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.630657 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.630668 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.630684 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.630696 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.678411 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lbzx5" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.684335 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.689847 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-84c69" Feb 23 06:42:25 crc kubenswrapper[4626]: W0223 06:42:25.695920 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b11f67b_b1fe_456a_843e_471433062d6c.slice/crio-46d7a18e78a9bcf1a4a44fc318445b88695c36310222d38b6de1d89357a9e9bf WatchSource:0}: Error finding container 46d7a18e78a9bcf1a4a44fc318445b88695c36310222d38b6de1d89357a9e9bf: Status 404 returned error can't find the container with id 46d7a18e78a9bcf1a4a44fc318445b88695c36310222d38b6de1d89357a9e9bf Feb 23 06:42:25 crc kubenswrapper[4626]: W0223 06:42:25.709341 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb6f72e_5acb_4a3b_8956_c8f89d47afe0.slice/crio-6ea7cef9ce8ea96ad55482c4ed61c71f389d474e49a130af571504af6fbbc012 WatchSource:0}: Error finding container 6ea7cef9ce8ea96ad55482c4ed61c71f389d474e49a130af571504af6fbbc012: Status 404 returned error can't find the container with id 6ea7cef9ce8ea96ad55482c4ed61c71f389d474e49a130af571504af6fbbc012 Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.732480 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lhplf"] Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.736683 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.737622 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.737657 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.737670 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.737687 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.737699 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.741006 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.743114 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.743184 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.743559 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.743705 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.744181 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.746765 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.751936 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.762220 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.770686 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.778232 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.784669 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.796094 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.803381 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.812369 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.819354 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.826979 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.835595 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.840893 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.840923 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.840935 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.840950 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.840963 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.848818 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894517 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-slash\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894547 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-netd\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894585 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-kubelet\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894602 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-etc-openvswitch\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894621 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894711 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-systemd-units\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894746 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-node-log\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894780 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-openvswitch\") pod 
\"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894842 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-script-lib\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894896 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-ovn-kubernetes\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894935 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-netns\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.894973 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-var-lib-openvswitch\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895001 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-ovn\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895024 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5475\" (UniqueName: \"kubernetes.io/projected/a4eb8735-20e6-4bd1-8965-4a360e39a919-kube-api-access-b5475\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895086 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-systemd\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895131 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-env-overrides\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895153 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-config\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895182 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovn-node-metrics-cert\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895219 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-log-socket\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.895240 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-bin\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.944638 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.944668 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.944677 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.944693 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.944703 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:25Z","lastTransitionTime":"2026-02-23T06:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.995755 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.995984 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-etc-openvswitch\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996023 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996047 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-kubelet\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996078 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-node-log\") 
pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996099 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-systemd-units\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996117 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-openvswitch\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996136 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-script-lib\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996158 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-ovn-kubernetes\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996177 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-netns\") pod \"ovnkube-node-lhplf\" (UID: 
\"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996197 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-var-lib-openvswitch\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996215 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-ovn\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996238 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5475\" (UniqueName: \"kubernetes.io/projected/a4eb8735-20e6-4bd1-8965-4a360e39a919-kube-api-access-b5475\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996259 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-systemd\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996288 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-env-overrides\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996316 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-config\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996340 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovn-node-metrics-cert\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996366 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-log-socket\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996405 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-bin\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996426 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-slash\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc 
kubenswrapper[4626]: I0223 06:42:25.996454 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-netd\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996548 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-netd\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996601 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-ovn-kubernetes\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996628 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-netns\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996650 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-var-lib-openvswitch\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996671 4626 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-ovn\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.996964 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-systemd\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997025 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-bin\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997053 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-kubelet\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997064 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-log-socket\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997070 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-etc-openvswitch\") 
pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997095 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-node-log\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997096 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997103 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-script-lib\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997100 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-systemd-units\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997116 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-openvswitch\") pod \"ovnkube-node-lhplf\" (UID: 
\"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997121 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-slash\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997586 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 22:36:25.066431079 +0000 UTC Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.997846 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-config\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.998348 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c" Feb 23 06:42:25 crc kubenswrapper[4626]: E0223 06:42:25.998683 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 23 06:42:25 crc kubenswrapper[4626]: I0223 06:42:25.999059 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-env-overrides\") pod 
\"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.003101 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovn-node-metrics-cert\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.012241 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5475\" (UniqueName: \"kubernetes.io/projected/a4eb8735-20e6-4bd1-8965-4a360e39a919-kube-api-access-b5475\") pod \"ovnkube-node-lhplf\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.046910 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.046949 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.046959 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.046977 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.046989 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.051616 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.149408 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.149735 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.149748 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.149769 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.149797 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.252479 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.252537 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.252549 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.252567 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.252581 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.301083 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cb6f72e-5acb-4a3b-8956-c8f89d47afe0" containerID="cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945" exitCode=0 Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.301173 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerDied","Data":"cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.301214 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerStarted","Data":"6ea7cef9ce8ea96ad55482c4ed61c71f389d474e49a130af571504af6fbbc012"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.311099 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5" exitCode=0 Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.311167 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.311194 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"a699ecbb18f2fbbaae7dca1605ee06866be003785ac7fd125e047c3a9b94bfa7"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.314106 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.314184 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.314197 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"46d7a18e78a9bcf1a4a44fc318445b88695c36310222d38b6de1d89357a9e9bf"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.316728 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerStarted","Data":"ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.316775 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerStarted","Data":"b8c765eff7563f0dbf1e38975fa73657bf089cbbdf66bd6529ff8b140b141bc3"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.317217 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.319751 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c"
Feb 23 06:42:26 crc kubenswrapper[4626]: E0223 06:42:26.319888 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.320543 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-km45b" event={"ID":"939db1b0-d1bf-495e-a842-f0d102e2a420","Type":"ContainerStarted","Data":"1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.320642 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-km45b" 
event={"ID":"939db1b0-d1bf-495e-a842-f0d102e2a420","Type":"ContainerStarted","Data":"4653eafa39c8765366e12bf326487ca5d33c8559b864a0d57fbad4fe40c2ee86"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.327048 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.335833 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.343968 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.349079 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.356351 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.356618 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.356628 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.356645 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.356673 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.361334 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.373074 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.382214 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.392750 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.403759 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.415796 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.432334 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.444189 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.463885 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.463922 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.463931 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.463946 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 
06:42:26.463957 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.474090 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.482152 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.492289 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.502729 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 
06:42:26.516758 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee
1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.526619 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.534672 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.541974 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.549861 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.559427 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.567134 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.567678 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.567725 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.567739 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.567967 4626 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.567980 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.576004 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.583132 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.670454 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.670527 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.670538 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.670573 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.670587 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.772793 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.772828 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.772839 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.772858 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.772871 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.877462 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.877517 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.877529 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.877548 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.877563 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.980051 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.980101 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.980110 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.980125 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.980135 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:26Z","lastTransitionTime":"2026-02-23T06:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.981600 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.981616 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.981727 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:26 crc kubenswrapper[4626]: E0223 06:42:26.981820 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:26 crc kubenswrapper[4626]: E0223 06:42:26.982057 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:26 crc kubenswrapper[4626]: E0223 06:42:26.982118 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:26 crc kubenswrapper[4626]: I0223 06:42:26.997937 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 22:23:13.165176076 +0000 UTC Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.082529 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.082558 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.082569 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.082586 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.082596 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.184837 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.184876 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.184885 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.184902 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.184912 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.286858 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.286898 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.286906 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.286921 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.286930 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.325414 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.325457 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.325468 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.325477 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.325488 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.325513 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b"} Feb 23 06:42:27 crc kubenswrapper[4626]: 
I0223 06:42:27.326849 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.326898 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.329406 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cb6f72e-5acb-4a3b-8956-c8f89d47afe0" containerID="1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27" exitCode=0 Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.329436 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerDied","Data":"1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.340542 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.349666 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.356373 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.365738 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.374821 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.389254 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.389868 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.389896 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.389905 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.389916 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.389927 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.398545 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.406698 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.415069 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.423309 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.433871 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.443373 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.452151 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.459898 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.470246 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.479921 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.490314 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.492962 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.493007 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.493020 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.493037 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.493048 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.498835 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.510790 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.522270 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.531590 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.540578 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.551183 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.561012 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.568291 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.581672 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.595198 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.595330 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.595422 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.595533 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.595612 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.698203 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.698241 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.698253 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.698269 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.698282 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.705837 4626 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.801234 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.801273 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.801283 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.801298 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.801309 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.903821 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.903919 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.903990 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.904065 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.904120 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:27Z","lastTransitionTime":"2026-02-23T06:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.992182 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:27 crc kubenswrapper[4626]: I0223 06:42:27.998666 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:38:42.907472645 +0000 UTC Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.000054 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.007062 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.007088 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.007100 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.007116 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.007127 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.016769 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.025855 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.037048 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef
324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.046895 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.057729 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.070277 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.081740 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.093304 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.102843 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.112319 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.112362 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.112376 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.112394 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.112406 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.112718 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.121487 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.215291 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.215328 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.215339 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.215357 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.215367 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.317438 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.317482 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.317492 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.317532 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.317546 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.334348 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cb6f72e-5acb-4a3b-8956-c8f89d47afe0" containerID="9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735" exitCode=0 Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.334390 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerDied","Data":"9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.353528 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.369027 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937
a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.377775 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.390660 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.400455 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.408718 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.415981 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.421000 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.421033 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.421043 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.421058 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.421069 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.429327 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.436723 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.446818 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef
324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.458579 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.468910 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.478092 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.523634 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.523671 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.523682 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc 
kubenswrapper[4626]: I0223 06:42:28.523698 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.523710 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.626613 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.626930 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.626946 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.626963 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.626974 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.729281 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.729324 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.729336 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.729355 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.729367 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.831240 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.831380 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.831470 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.831584 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.831645 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.933529 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.933569 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.933580 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.933594 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.933604 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:28Z","lastTransitionTime":"2026-02-23T06:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.981530 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:28 crc kubenswrapper[4626]: E0223 06:42:28.981646 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.981720 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.981746 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:28 crc kubenswrapper[4626]: E0223 06:42:28.981868 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:28 crc kubenswrapper[4626]: E0223 06:42:28.981947 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:28 crc kubenswrapper[4626]: I0223 06:42:28.999033 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 10:36:43.363150601 +0000 UTC Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.035715 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.035750 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.035764 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.035775 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.035795 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.138283 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.138320 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.138331 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.138349 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.138358 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.240308 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.240341 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.240352 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.240366 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.240377 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.340858 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.341688 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.341717 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.341727 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.341746 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.341758 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.343164 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cb6f72e-5acb-4a3b-8956-c8f89d47afe0" containerID="e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb" exitCode=0 Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.343217 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerDied","Data":"e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.352847 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.370598 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef
324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.380721 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.389360 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.398073 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.408567 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.419529 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.427469 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.436624 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.445039 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.445683 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 
06:42:29.445718 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.445728 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.445744 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.445756 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.453533 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.460863 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.476117 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:29Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.548190 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.548225 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.548236 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.548253 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.548265 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.650172 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.650209 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.650220 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.650239 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.650252 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.752167 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.752196 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.752207 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.752224 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.752235 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.854684 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.854717 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.854726 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.854737 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.854747 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.956529 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.956576 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.956589 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.956610 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:29 crc kubenswrapper[4626]: I0223 06:42:29.956624 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:29Z","lastTransitionTime":"2026-02-23T06:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:29.999927 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 22:26:51.452087208 +0000 UTC Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.058268 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.058299 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.058311 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.058326 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.058338 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.159870 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.159893 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.159902 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.159918 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.159929 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.262013 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.262041 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.262049 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.262061 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.262070 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.348079 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cb6f72e-5acb-4a3b-8956-c8f89d47afe0" containerID="3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7" exitCode=0 Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.348136 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerDied","Data":"3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.361024 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.361944 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.366724 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.366751 4626 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.366760 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.366772 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.366784 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.373679 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.385860 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.393233 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.401522 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef
324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.415080 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.425330 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.437480 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.447283 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.458352 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.467092 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.470265 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.470303 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.470316 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc 
kubenswrapper[4626]: I0223 06:42:30.470330 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.470341 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.476602 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 
06:42:30.489102 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.497658 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.505684 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.518908 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.533073 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.543997 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.555470 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.567307 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.573202 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.573230 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.573240 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.573259 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.573271 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.578894 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.590804 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.602092 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.611546 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.621487 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.631580 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:30Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.675445 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 
06:42:30.675481 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.675491 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.675550 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.675562 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.778320 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.778363 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.778372 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.778390 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.778401 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.841934 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842060 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:42:46.842039706 +0000 UTC m=+119.181368971 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.842107 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.842178 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.842202 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.842227 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842285 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842338 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:46.842325407 +0000 UTC m=+119.181654672 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842360 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842379 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842386 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842414 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842427 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842391 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842481 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:46.842467605 +0000 UTC m=+119.181796861 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842445 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842563 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:46.842555341 +0000 UTC m=+119.181884607 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.842633 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:46.842572332 +0000 UTC m=+119.181901589 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.881014 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.881045 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.881056 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.881071 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.881084 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.981868 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.981903 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.982278 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.982378 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.982550 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:42:30 crc kubenswrapper[4626]: E0223 06:42:30.982777 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.983552 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.983586 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.983596 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.983610 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.983620 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:30Z","lastTransitionTime":"2026-02-23T06:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:30 crc kubenswrapper[4626]: I0223 06:42:30.996317 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.000457 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 21:32:58.591168848 +0000 UTC
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.085918 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.085941 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.085949 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.085961 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.085971 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.188258 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.188294 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.188306 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.188321 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.188334 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.291452 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.291489 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.291519 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.291541 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.291555 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.370689 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82"}
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.370988 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.371002 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf"
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.373372 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c"}
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.377429 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cb6f72e-5acb-4a3b-8956-c8f89d47afe0" containerID="40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c" exitCode=0
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.377890 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerDied","Data":"40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c"}
Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.384049 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.394189 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:31 crc 
kubenswrapper[4626]: I0223 06:42:31.394213 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.394223 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.394235 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.394245 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.397318 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.398530 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.415045 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.421086 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qtbvk"] Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.421779 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.423333 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.424367 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.425051 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.426161 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.430848 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.444016 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.453692 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.462541 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.470054 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.485599 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-n
ode-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.498265 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 
06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.499631 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.499724 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.499737 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.499756 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.499769 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.510862 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef
324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.523035 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.537349 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.548446 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znjd\" (UniqueName: \"kubernetes.io/projected/27a832b0-4c0c-4f1a-9d07-770876ca1505-kube-api-access-4znjd\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.548553 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/27a832b0-4c0c-4f1a-9d07-770876ca1505-serviceca\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.548620 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27a832b0-4c0c-4f1a-9d07-770876ca1505-host\") pod 
\"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.550952 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\
\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.559892 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.568688 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.584040 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-m
etrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-
controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.592197 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.602014 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.602367 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.602398 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.602409 4626 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.602428 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.602438 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.613314 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.623447 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.644675 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.648964 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/27a832b0-4c0c-4f1a-9d07-770876ca1505-serviceca\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.649006 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/27a832b0-4c0c-4f1a-9d07-770876ca1505-host\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.649043 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4znjd\" (UniqueName: \"kubernetes.io/projected/27a832b0-4c0c-4f1a-9d07-770876ca1505-kube-api-access-4znjd\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.649127 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27a832b0-4c0c-4f1a-9d07-770876ca1505-host\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.649950 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/27a832b0-4c0c-4f1a-9d07-770876ca1505-serviceca\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.657698 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.669787 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znjd\" (UniqueName: \"kubernetes.io/projected/27a832b0-4c0c-4f1a-9d07-770876ca1505-kube-api-access-4znjd\") pod \"node-ca-qtbvk\" (UID: \"27a832b0-4c0c-4f1a-9d07-770876ca1505\") " pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.679528 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.704706 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.704760 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.704771 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.704797 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.704819 4626 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.704975 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.720609 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.732581 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.734351 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qtbvk" Feb 23 06:42:31 crc kubenswrapper[4626]: W0223 06:42:31.747030 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27a832b0_4c0c_4f1a_9d07_770876ca1505.slice/crio-236594513f8a8697e034d47286e3f168bb0bcfe39883233f4d854b628ce0bdac WatchSource:0}: Error finding container 236594513f8a8697e034d47286e3f168bb0bcfe39883233f4d854b628ce0bdac: Status 404 returned error can't find the container with id 236594513f8a8697e034d47286e3f168bb0bcfe39883233f4d854b628ce0bdac Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.752940 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.773873 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:31Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.807193 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.807222 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.807232 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.807250 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.807260 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.911024 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.911060 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.911068 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.911084 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:31 crc kubenswrapper[4626]: I0223 06:42:31.911097 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:31Z","lastTransitionTime":"2026-02-23T06:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.001277 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:34:12.690952338 +0000 UTC Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.013173 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.013228 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.013241 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.013257 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.013266 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.115306 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.115342 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.115352 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.115367 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.115377 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.217440 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.217470 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.217479 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.217491 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.217526 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.320184 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.320228 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.320239 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.320255 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.320265 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.386255 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" event={"ID":"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0","Type":"ContainerStarted","Data":"2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.388542 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qtbvk" event={"ID":"27a832b0-4c0c-4f1a-9d07-770876ca1505","Type":"ContainerStarted","Data":"ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.388603 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qtbvk" event={"ID":"27a832b0-4c0c-4f1a-9d07-770876ca1505","Type":"ContainerStarted","Data":"236594513f8a8697e034d47286e3f168bb0bcfe39883233f4d854b628ce0bdac"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.388929 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.398044 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.410270 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 
06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.410888 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d
838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.423073 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.423104 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.423117 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.423137 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.423146 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.429208 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.437963 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.446517 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.455569 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.474078 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.483011 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.491475 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.498891 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.509258 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435
706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.516981 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.525877 4626 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.525916 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.525930 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.525945 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.525955 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.528601 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef
324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.539555 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.548380 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.556702 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.572206 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.581405 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.589279 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.598238 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.606746 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.613707 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.622448 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.627600 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.627635 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.627645 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.627661 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.627672 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.632051 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.641325 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f28
08b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Runni
ng\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.651060 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e
6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d9693
7a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\
\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.660177 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.673366 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.682075 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.690738 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:32Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.730546 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.730580 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.730591 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.730607 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.730618 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.832648 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.832686 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.832697 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.832720 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.832736 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.935631 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.935679 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.935714 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.935735 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.935748 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:32Z","lastTransitionTime":"2026-02-23T06:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.981053 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.981077 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:32 crc kubenswrapper[4626]: I0223 06:42:32.981054 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:32 crc kubenswrapper[4626]: E0223 06:42:32.981217 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:32 crc kubenswrapper[4626]: E0223 06:42:32.981318 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:32 crc kubenswrapper[4626]: E0223 06:42:32.981405 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.001406 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 06:27:54.274236137 +0000 UTC Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.038720 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.038779 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.038798 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.038829 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.038847 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.141522 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.141568 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.141584 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.141603 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.141615 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.244304 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.244345 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.244356 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.244373 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.244384 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.346226 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.346266 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.346276 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.346297 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.346311 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.448932 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.448977 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.448987 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.449002 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.449012 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.550933 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.550961 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.550969 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.550981 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.550993 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.653703 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.653751 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.653761 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.653779 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.653790 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.756224 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.756464 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.756474 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.756487 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.756519 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.858790 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.858822 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.858831 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.858846 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.858855 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.960212 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.960247 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.960258 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.960272 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:33 crc kubenswrapper[4626]: I0223 06:42:33.960281 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:33Z","lastTransitionTime":"2026-02-23T06:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.001592 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 05:19:06.886028068 +0000 UTC Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.062333 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.062359 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.062368 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.062379 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.062389 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.164443 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.164473 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.164481 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.164518 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.164530 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.266401 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.266674 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.266802 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.266913 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.266973 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.369010 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.369124 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.369184 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.369272 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.369335 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.396583 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/0.log" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.399380 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82" exitCode=1 Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.399422 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.400010 4626 scope.go:117] "RemoveContainer" containerID="6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.410197 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.421112 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.430442 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.445300 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.454300 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.462015 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.471545 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.471585 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.471595 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.471614 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.471625 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.476514 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:33Z\\\",\\\"message\\\":\\\"versions/factory.go:140\\\\nI0223 06:42:33.394267 6085 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.394691 6085 reflector.go:311] Stopping reflector 
*v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395053 6085 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395141 6085 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395153 6085 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395190 6085 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0223 06:42:33.395243 6085 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:33.395324 6085 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.484833 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.496715 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef
324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.506801 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.516175 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.524413 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.535480 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.548861 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405
e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.556634 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:34Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.574649 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.574682 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.574694 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.574710 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.574718 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.677083 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.677130 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.677170 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.677187 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.677201 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.779009 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.779045 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.779055 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.779070 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.779080 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.881879 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.881923 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.881938 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.881958 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.881976 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.981861 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.981904 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.981914 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:34 crc kubenswrapper[4626]: E0223 06:42:34.982001 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:34 crc kubenswrapper[4626]: E0223 06:42:34.982094 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:34 crc kubenswrapper[4626]: E0223 06:42:34.982339 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.983832 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.983861 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.983870 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.983885 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:34 crc kubenswrapper[4626]: I0223 06:42:34.983902 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:34Z","lastTransitionTime":"2026-02-23T06:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.001803 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:51:10.937452968 +0000 UTC Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.085911 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.085948 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.085963 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.085980 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.085994 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.188337 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.188747 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.188757 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.188774 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.188785 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.291394 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.291438 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.291450 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.291475 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.291489 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.393391 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.393421 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.393430 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.393443 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.393453 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.403433 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/1.log" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.404076 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/0.log" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.406371 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0" exitCode=1 Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.406414 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.406457 4626 scope.go:117] "RemoveContainer" containerID="6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.407024 4626 scope.go:117] "RemoveContainer" containerID="ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0" Feb 23 06:42:35 crc kubenswrapper[4626]: E0223 06:42:35.407183 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.417209 4626 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.428479 4626 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:
42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e
3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.437732 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.447322 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.466927 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.476142 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.484398 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.491951 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.497060 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.497095 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.497107 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.497128 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.497142 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.506914 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fea42993bd60bd46d580e2ced0acddc8480a285fa0f5ec83c422ce75e4beb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:33Z\\\",\\\"message\\\":\\\"versions/factory.go:140\\\\nI0223 06:42:33.394267 6085 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.394691 6085 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395053 6085 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395141 6085 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395153 6085 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:33.395190 6085 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0223 06:42:33.395243 6085 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:33.395324 6085 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":
\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc 
kubenswrapper[4626]: I0223 06:42:35.514599 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4
znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.523412 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.531849 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.539581 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.546764 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.556414 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.599323 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.599364 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.599375 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.599390 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.599401 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.701526 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.701573 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.701583 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.701599 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.701611 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.803566 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.803608 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.803618 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.803632 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.803642 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.827341 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.827466 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.827558 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.827620 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.827684 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: E0223 06:42:35.836580 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.839366 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.839400 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.839410 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.839424 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.839434 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: E0223 06:42:35.848878 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.851480 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.851526 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.851535 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.851546 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.851555 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: E0223 06:42:35.859362 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.861854 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.861886 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.861895 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.861907 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.861916 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: E0223 06:42:35.872109 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.874890 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.874926 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.874938 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.874961 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.874974 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:35 crc kubenswrapper[4626]: E0223 06:42:35.884348 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:35Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:35 crc kubenswrapper[4626]: E0223 06:42:35.884459 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.906040 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.906075 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.906085 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.906101 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:35 crc kubenswrapper[4626]: I0223 06:42:35.906115 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:35Z","lastTransitionTime":"2026-02-23T06:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.002597 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 01:43:45.188713191 +0000 UTC Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.007968 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.008012 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.008024 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.008046 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.008059 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.111661 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.111909 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.111921 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.111936 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.111947 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.214071 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.214106 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.214118 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.214135 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.214150 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.316106 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.316142 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.316150 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.316167 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.316178 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.411173 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/1.log" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.414697 4626 scope.go:117] "RemoveContainer" containerID="ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0" Feb 23 06:42:36 crc kubenswrapper[4626]: E0223 06:42:36.414875 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.417474 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.417527 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.417539 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.417552 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.417563 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.423366 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.432819 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.444118 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.454350 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.464669 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.474028 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.485088 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47
546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:
42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.500529 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.509416 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.518758 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.519779 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.519817 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.519829 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.519855 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.519867 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.527730 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.535661 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.542694 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.555804 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.563175 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.622175 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.622218 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.622230 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.622254 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.622265 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.724048 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.724163 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.724231 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.724301 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.724360 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.826059 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.826141 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.826152 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.826162 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.826173 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.928812 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.928865 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.928876 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.928896 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.928907 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:36Z","lastTransitionTime":"2026-02-23T06:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.981306 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.981344 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:36 crc kubenswrapper[4626]: I0223 06:42:36.981366 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:36 crc kubenswrapper[4626]: E0223 06:42:36.981458 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:36 crc kubenswrapper[4626]: E0223 06:42:36.981556 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:36 crc kubenswrapper[4626]: E0223 06:42:36.981682 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.003059 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 02:57:19.381177152 +0000 UTC Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.030871 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.030901 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.030915 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.030931 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.030941 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.132772 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.132820 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.132832 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.132870 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.132880 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.181683 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6"] Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.182512 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.184326 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.184491 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.197830 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.200172 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h954v\" (UniqueName: \"kubernetes.io/projected/c07a7b21-10ef-4b99-87cf-80a2a941c363-kube-api-access-h954v\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.200213 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c07a7b21-10ef-4b99-87cf-80a2a941c363-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.200239 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c07a7b21-10ef-4b99-87cf-80a2a941c363-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.200407 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c07a7b21-10ef-4b99-87cf-80a2a941c363-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.206321 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.214491 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T0
6:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.222445 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.231039 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.235157 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.235187 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.235198 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.235216 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.235227 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.240207 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.247709 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.259587 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.268484 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.278336 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.287589 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.297397 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47
546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:
42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.300754 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c07a7b21-10ef-4b99-87cf-80a2a941c363-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.300788 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h954v\" (UniqueName: \"kubernetes.io/projected/c07a7b21-10ef-4b99-87cf-80a2a941c363-kube-api-access-h954v\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.300821 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c07a7b21-10ef-4b99-87cf-80a2a941c363-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.300847 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c07a7b21-10ef-4b99-87cf-80a2a941c363-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.301458 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c07a7b21-10ef-4b99-87cf-80a2a941c363-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.301530 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c07a7b21-10ef-4b99-87cf-80a2a941c363-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.308298 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c07a7b21-10ef-4b99-87cf-80a2a941c363-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.311216 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.314571 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h954v\" (UniqueName: \"kubernetes.io/projected/c07a7b21-10ef-4b99-87cf-80a2a941c363-kube-api-access-h954v\") pod \"ovnkube-control-plane-749d76644c-rqdv6\" (UID: \"c07a7b21-10ef-4b99-87cf-80a2a941c363\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.320991 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.329507 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.337404 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.337432 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.337442 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.337459 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.337469 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.338316 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.439652 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.439686 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.439699 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.439716 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.439730 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.497082 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.542202 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.542236 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.542246 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.542261 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.542275 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.645523 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.645563 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.645575 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.645593 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.645605 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.747565 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.747601 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.747612 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.747630 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.747642 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.850441 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.850476 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.850484 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.850515 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.850526 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.886880 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ls5wf"] Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.887601 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:37 crc kubenswrapper[4626]: E0223 06:42:37.887693 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.898854 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.906287 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chbhw\" (UniqueName: \"kubernetes.io/projected/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-kube-api-access-chbhw\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.906335 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.908472 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.916769 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.924717 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.935205 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.945950 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.952971 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.953017 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.953031 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.953052 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.953065 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:37Z","lastTransitionTime":"2026-02-23T06:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.956995 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z 
is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.969044 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.979462 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:37 crc kubenswrapper[4626]: I0223 06:42:37.995801 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.003910 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 15:47:10.691532609 +0000 UTC Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.005133 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.006975 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.007055 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-chbhw\" (UniqueName: \"kubernetes.io/projected/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-kube-api-access-chbhw\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:38 crc kubenswrapper[4626]: E0223 06:42:38.007198 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:38 crc kubenswrapper[4626]: E0223 06:42:38.007279 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:38.507258333 +0000 UTC m=+110.846587599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.014779 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.020072 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chbhw\" (UniqueName: \"kubernetes.io/projected/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-kube-api-access-chbhw\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.021849 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.044926 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.055259 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.055307 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.055317 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.055336 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.055347 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.062429 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.094863 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc 
kubenswrapper[4626]: I0223 06:42:38.108115 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.118965 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 
06:42:38.128914 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.137961 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.149566 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC 
(now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.157206 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.157245 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.157255 4626 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.157272 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.157283 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.162268 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.177338 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.186629 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.197558 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47
546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:
42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.211408 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.221363 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.230296 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.239154 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.252972 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.259311 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.259346 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.259356 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.259371 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.259381 4626 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.262465 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.271038 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc 
kubenswrapper[4626]: I0223 06:42:38.280171 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.287843 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.361826 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.361949 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.362021 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.362089 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.362154 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.420660 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" event={"ID":"c07a7b21-10ef-4b99-87cf-80a2a941c363","Type":"ContainerStarted","Data":"b2717e3576af78b5391c7c3cc2b1997dfdc9385df3557dd29b7cabc6087601f4"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.420700 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" event={"ID":"c07a7b21-10ef-4b99-87cf-80a2a941c363","Type":"ContainerStarted","Data":"be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.420711 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" event={"ID":"c07a7b21-10ef-4b99-87cf-80a2a941c363","Type":"ContainerStarted","Data":"ef29ec8b914714ef145f7f5fea307f90157b6e23c48ca196d1153dbdec373adf"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.429818 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.438134 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.452609 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.463615 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.463688 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.463734 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.463745 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc 
kubenswrapper[4626]: I0223 06:42:38.463767 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.463779 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.471975 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.479559 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.493221 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.500685 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.508148 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc 
kubenswrapper[4626]: I0223 06:42:38.512286 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:38 crc kubenswrapper[4626]: E0223 06:42:38.512826 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:38 crc kubenswrapper[4626]: E0223 06:42:38.512911 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:39.512891922 +0000 UTC m=+111.852221188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.517616 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.526465 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.535144 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.543831 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.551734 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.563087 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.565421 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.565454 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.565465 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.565482 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.565509 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.573395 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\
\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.586061 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.667758 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.667881 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.667960 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.668048 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.668124 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.770669 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.770696 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.770708 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.770721 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.770731 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.873131 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.873163 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.873173 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.873186 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.873196 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.975031 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.975077 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.975086 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.975101 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.975109 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:38Z","lastTransitionTime":"2026-02-23T06:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.981431 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:38 crc kubenswrapper[4626]: E0223 06:42:38.981539 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.981660 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:38 crc kubenswrapper[4626]: I0223 06:42:38.981734 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:38 crc kubenswrapper[4626]: E0223 06:42:38.981888 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:38 crc kubenswrapper[4626]: E0223 06:42:38.982006 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.004949 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 07:06:59.444056957 +0000 UTC Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.077572 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.077602 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.077612 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.077625 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.077634 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.179969 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.180001 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.180010 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.180023 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.180031 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.282369 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.282400 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.282412 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.282427 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.282437 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.384929 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.384973 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.384984 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.384995 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.385002 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.487425 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.487468 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.487483 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.487514 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.487525 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.522561 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:39 crc kubenswrapper[4626]: E0223 06:42:39.522733 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:39 crc kubenswrapper[4626]: E0223 06:42:39.522808 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:41.522789448 +0000 UTC m=+113.862118714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.589758 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.589791 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.589801 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.589812 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.589820 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.692339 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.692377 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.692387 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.692400 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.692409 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.794377 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.794399 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.794408 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.794419 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.794426 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.895794 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.895830 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.895841 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.895863 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.895873 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:39Z","lastTransitionTime":"2026-02-23T06:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.981897 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:39 crc kubenswrapper[4626]: E0223 06:42:39.982036 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:39 crc kubenswrapper[4626]: I0223 06:42:39.982547 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.000047 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.000079 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.000091 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.000105 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.000115 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.005590 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 14:22:45.399364556 +0000 UTC Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.103221 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.103244 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.103252 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.103265 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.103275 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.205705 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.205753 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.205764 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.205784 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.205801 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.308212 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.308255 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.308266 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.308283 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.308295 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.410484 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.410530 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.410538 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.410552 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.410562 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.429040 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.431149 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.431415 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.440962 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed
0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.448395 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.460915 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.467895 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.474593 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc 
kubenswrapper[4626]: I0223 06:42:40.483156 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.492195 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.500072 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.507225 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.513119 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.513152 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.513162 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc 
kubenswrapper[4626]: I0223 06:42:40.513175 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.513185 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.514879 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc9385df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23
T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.524039 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.534653 4626 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.542842 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.554202 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.564648 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.573648 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.588230 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:40Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.615165 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.615206 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.615219 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.615236 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.615250 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.718102 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.718148 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.718158 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.718180 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.718194 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.820016 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.820047 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.820056 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.820077 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.820095 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.922256 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.922296 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.922305 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.922319 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.922327 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:40Z","lastTransitionTime":"2026-02-23T06:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.981304 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.981349 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:40 crc kubenswrapper[4626]: E0223 06:42:40.981443 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:40 crc kubenswrapper[4626]: I0223 06:42:40.981324 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:40 crc kubenswrapper[4626]: E0223 06:42:40.981608 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:40 crc kubenswrapper[4626]: E0223 06:42:40.981744 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.006621 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 17:28:16.447188015 +0000 UTC Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.024552 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.024604 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.024616 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.024636 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.024647 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.126599 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.126727 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.126786 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.126848 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.126918 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.228801 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.228828 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.228836 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.228850 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.228859 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.330987 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.331032 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.331043 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.331059 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.331071 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.433661 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.433694 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.433703 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.433718 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.433726 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.535438 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.535474 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.535485 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.535524 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.535536 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.540776 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:41 crc kubenswrapper[4626]: E0223 06:42:41.540928 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:41 crc kubenswrapper[4626]: E0223 06:42:41.540990 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:45.54097525 +0000 UTC m=+117.880304516 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.636987 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.637010 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.637019 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.637030 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.637040 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.739486 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.739534 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.739544 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.739554 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.739562 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.841304 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.841339 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.841347 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.841363 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.841374 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.943325 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.943345 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.943355 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.943366 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.943374 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:41Z","lastTransitionTime":"2026-02-23T06:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:41 crc kubenswrapper[4626]: I0223 06:42:41.981939 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:41 crc kubenswrapper[4626]: E0223 06:42:41.982098 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.007571 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:27:02.57277532 +0000 UTC Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.045213 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.045243 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.045251 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.045264 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.045272 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.146841 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.146972 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.147036 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.147108 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.147177 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.248861 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.248907 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.248917 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.248929 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.248938 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.350850 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.351219 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.351285 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.351350 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.351407 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.453685 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.453722 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.453733 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.453748 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.453756 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.556149 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.556191 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.556200 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.556218 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.556229 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.658424 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.658475 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.658486 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.658517 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.658527 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.760585 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.760618 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.760628 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.760644 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.760656 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.862728 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.862760 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.862784 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.862796 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.862805 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.964860 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.964922 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.964938 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.964951 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.964961 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:42Z","lastTransitionTime":"2026-02-23T06:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.981033 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.981049 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:42 crc kubenswrapper[4626]: I0223 06:42:42.981051 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:42 crc kubenswrapper[4626]: E0223 06:42:42.981124 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:42 crc kubenswrapper[4626]: E0223 06:42:42.981216 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:42 crc kubenswrapper[4626]: E0223 06:42:42.981291 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.008443 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 22:31:27.590001354 +0000 UTC Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.066378 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.066417 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.066431 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.066447 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.066457 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.168045 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.168080 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.168106 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.168117 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.168125 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.269912 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.269966 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.269976 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.269987 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.270016 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.372358 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.372402 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.372413 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.372429 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.372440 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.473963 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.473996 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.474023 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.474036 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.474044 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.575909 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.575933 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.575942 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.575956 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.575967 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.677889 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.677919 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.677929 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.677940 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.677948 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.779668 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.779692 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.779704 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.779716 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.779725 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.881310 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.881344 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.881354 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.881368 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.881380 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.981151 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:43 crc kubenswrapper[4626]: E0223 06:42:43.981317 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.983382 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.983478 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.983573 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.983629 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:43 crc kubenswrapper[4626]: I0223 06:42:43.983696 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:43Z","lastTransitionTime":"2026-02-23T06:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.009573 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 01:21:55.773535342 +0000 UTC Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.085764 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.085806 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.085817 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.085834 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.085846 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.188341 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.188377 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.188387 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.188403 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.188415 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.290222 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.290255 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.290265 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.290278 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.290289 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.392212 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.392256 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.392266 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.392284 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.392298 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.494402 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.494438 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.494448 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.494461 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.494470 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.596653 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.596676 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.596685 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.596699 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.596707 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.698443 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.698471 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.698482 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.698494 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.698521 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.800537 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.800579 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.800590 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.800606 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.800628 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.902522 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.902552 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.902562 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.902576 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.902584 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:44Z","lastTransitionTime":"2026-02-23T06:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.981998 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.982053 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:44 crc kubenswrapper[4626]: E0223 06:42:44.982100 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:44 crc kubenswrapper[4626]: E0223 06:42:44.982158 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:44 crc kubenswrapper[4626]: I0223 06:42:44.981999 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:44 crc kubenswrapper[4626]: E0223 06:42:44.982232 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.004480 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.004526 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.004537 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.004551 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.004560 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.010598 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 04:25:49.906766139 +0000 UTC Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.106790 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.106939 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.107010 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.107080 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.107138 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.208823 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.208854 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.208864 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.208877 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.208886 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.310874 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.310909 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.310935 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.310945 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.310951 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.412587 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.412650 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.412661 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.412673 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.412680 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.514126 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.514248 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.514336 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.514436 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.514564 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.580627 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:45 crc kubenswrapper[4626]: E0223 06:42:45.580980 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:45 crc kubenswrapper[4626]: E0223 06:42:45.581028 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:42:53.581012796 +0000 UTC m=+125.920342061 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.617056 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.617099 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.617116 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.617134 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.617147 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.719253 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.719280 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.719289 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.719301 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.719310 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.821611 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.821646 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.821656 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.821668 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.821678 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.924011 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.924042 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.924053 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.924064 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.924075 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:45Z","lastTransitionTime":"2026-02-23T06:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:45 crc kubenswrapper[4626]: I0223 06:42:45.981673 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:45 crc kubenswrapper[4626]: E0223 06:42:45.981788 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.011400 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 22:01:31.58192898 +0000 UTC Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.025833 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.025886 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.025905 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.025922 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.025931 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.086809 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.086839 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.086848 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.086857 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.086865 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.097126 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.100209 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.100236 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.100249 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.100259 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.100267 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.109428 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.111949 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.111988 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.111998 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.112014 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.112025 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.126947 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.129676 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.129714 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.129726 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.129741 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.129754 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.138958 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.141384 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.141418 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.141428 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.141439 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.141448 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.150179 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:46Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.150412 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.151579 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.151692 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.151760 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.151829 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.151886 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.254321 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.254463 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.254548 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.254627 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.254696 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.357145 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.357276 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.357346 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.357410 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.357489 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.459156 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.459183 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.459193 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.459203 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.459210 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.561113 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.561167 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.561194 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.561208 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.561219 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.663200 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.663224 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.663233 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.663242 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.663250 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.764971 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.765076 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.765087 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.765121 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.765131 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.867118 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.867150 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.867161 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.867173 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.867182 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.891002 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.891058 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.891082 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.891105 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891154 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-23 06:43:18.891134807 +0000 UTC m=+151.230464074 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891198 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.891201 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891213 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891209 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891240 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:42:46 
crc kubenswrapper[4626]: E0223 06:42:46.891252 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891274 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891297 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:43:18.891283969 +0000 UTC m=+151.230613245 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891225 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891303 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891320 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:43:18.891308455 +0000 UTC m=+151.230637751 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891335 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:43:18.891328885 +0000 UTC m=+151.230658161 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.891349 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:43:18.891342259 +0000 UTC m=+151.230671535 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.968692 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.968719 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.968728 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.968740 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.968749 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:46Z","lastTransitionTime":"2026-02-23T06:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.981159 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.981175 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.981177 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.981249 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.981303 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:46 crc kubenswrapper[4626]: E0223 06:42:46.981334 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:46 crc kubenswrapper[4626]: I0223 06:42:46.982037 4626 scope.go:117] "RemoveContainer" containerID="ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.012393 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 08:42:10.018887371 +0000 UTC Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.071444 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.071478 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.071490 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.071524 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.071539 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.178151 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.178185 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.178196 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.178213 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.178224 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.280364 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.280393 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.280401 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.280413 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.280445 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.382759 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.382830 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.382851 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.382880 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.382895 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.452936 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/1.log" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.454976 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.455472 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.466518 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3
f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.484488 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e1
2c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729
d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.490334 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.490426 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.490478 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.490555 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.490629 
4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.495250 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.516812 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.526198 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.536297 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.545186 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.560517 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.568558 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.580124 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc 
kubenswrapper[4626]: I0223 06:42:47.592688 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.592717 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.592727 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.592743 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.592753 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.595763 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.605549 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.614358 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.622569 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.631850 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.641461 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe9
31c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.652478 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.695168 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.695202 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.695213 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.695226 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.695238 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.797594 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.797624 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.797635 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.797649 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.797660 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.899201 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.899248 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.899258 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.899276 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.899289 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:47Z","lastTransitionTime":"2026-02-23T06:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.981106 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:47 crc kubenswrapper[4626]: E0223 06:42:47.981231 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:47 crc kubenswrapper[4626]: I0223 06:42:47.992837 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:47 crc kubenswrapper[4626]: E0223 06:42:47.999534 4626 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.002653 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.011671 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.012791 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 22:39:23.43043302 +0000 UTC Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.019535 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3
d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.028805 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe9
31c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.037978 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.047317 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.058986 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: E0223 06:42:48.067233 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.068633 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.089031 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.097949 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.106654 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.115615 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.132203 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.143666 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.150535 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc 
kubenswrapper[4626]: I0223 06:42:48.158008 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.460607 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/2.log" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.461221 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/1.log" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.468941 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b" exitCode=1 Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.468998 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b"} Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.469051 4626 scope.go:117] "RemoveContainer" containerID="ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.469854 4626 scope.go:117] "RemoveContainer" containerID="7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b" Feb 23 06:42:48 crc kubenswrapper[4626]: E0223 06:42:48.470169 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.490740 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208
f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.503181 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.513811 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.525281 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.539488 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef2a5a33644105c2f6ecae4d39ec26e238a07b76f1964c0de12483b34e18e0f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:35Z\\\",\\\"message\\\":\\\"SNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", 
Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0223 06:42:35.097381 6245 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nF0223 06:42:35.097383 6245 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\
\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 
crc kubenswrapper[4626]: I0223 06:42:48.547549 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.556206 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 
23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.564555 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.571709 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.579299 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.587143 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.595587 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.605142 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe9
31c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.614772 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.624155 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.633250 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.643767 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47
546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:
42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.981206 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.981259 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:48 crc kubenswrapper[4626]: E0223 06:42:48.981323 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:48 crc kubenswrapper[4626]: E0223 06:42:48.981374 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:48 crc kubenswrapper[4626]: I0223 06:42:48.981652 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:48 crc kubenswrapper[4626]: E0223 06:42:48.981864 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.014011 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 14:28:20.404362931 +0000 UTC Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.474298 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/2.log" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.478141 4626 scope.go:117] "RemoveContainer" containerID="7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b" Feb 23 06:42:49 crc kubenswrapper[4626]: E0223 06:42:49.478403 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.487861 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.502404 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.510811 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.518755 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc 
kubenswrapper[4626]: I0223 06:42:49.526833 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.535264 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.543293 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.551878 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.559001 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.571528 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe9
31c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.581716 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.590996 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.600763 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.610382 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.624447 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.633484 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.641996 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:49Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:49 crc kubenswrapper[4626]: I0223 06:42:49.982001 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:49 crc kubenswrapper[4626]: E0223 06:42:49.982124 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:50 crc kubenswrapper[4626]: I0223 06:42:50.014148 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 13:34:28.246295837 +0000 UTC Feb 23 06:42:50 crc kubenswrapper[4626]: I0223 06:42:50.981315 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:50 crc kubenswrapper[4626]: E0223 06:42:50.981429 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:50 crc kubenswrapper[4626]: I0223 06:42:50.981612 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:50 crc kubenswrapper[4626]: E0223 06:42:50.981660 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:50 crc kubenswrapper[4626]: I0223 06:42:50.981657 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:50 crc kubenswrapper[4626]: E0223 06:42:50.981803 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.015125 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 10:42:39.689729447 +0000 UTC Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.496317 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.505908 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.515274 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47
546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:
42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.530152 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.538141 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.546101 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.555150 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.562485 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc 
kubenswrapper[4626]: I0223 06:42:51.569756 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.576454 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.588570 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.595000 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.602613 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc 
kubenswrapper[4626]: I0223 06:42:51.611674 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c8
8f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.621909 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.630932 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.639754 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.648146 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:51Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:51 crc kubenswrapper[4626]: I0223 06:42:51.981616 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:51 crc kubenswrapper[4626]: E0223 06:42:51.981794 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:52 crc kubenswrapper[4626]: I0223 06:42:52.015643 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 21:06:58.83374519 +0000 UTC Feb 23 06:42:52 crc kubenswrapper[4626]: I0223 06:42:52.981204 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:52 crc kubenswrapper[4626]: I0223 06:42:52.981272 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:52 crc kubenswrapper[4626]: I0223 06:42:52.981307 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:52 crc kubenswrapper[4626]: E0223 06:42:52.982056 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:52 crc kubenswrapper[4626]: E0223 06:42:52.982170 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:52 crc kubenswrapper[4626]: E0223 06:42:52.982393 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:53 crc kubenswrapper[4626]: I0223 06:42:53.016569 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 15:00:20.174184338 +0000 UTC Feb 23 06:42:53 crc kubenswrapper[4626]: E0223 06:42:53.068779 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:42:53 crc kubenswrapper[4626]: I0223 06:42:53.650051 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:53 crc kubenswrapper[4626]: E0223 06:42:53.650181 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:53 crc kubenswrapper[4626]: E0223 06:42:53.650231 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:43:09.650217049 +0000 UTC m=+141.989546316 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:42:53 crc kubenswrapper[4626]: I0223 06:42:53.981613 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:53 crc kubenswrapper[4626]: E0223 06:42:53.981752 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:54 crc kubenswrapper[4626]: I0223 06:42:54.016649 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 00:19:37.793063768 +0000 UTC Feb 23 06:42:54 crc kubenswrapper[4626]: I0223 06:42:54.981562 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:54 crc kubenswrapper[4626]: E0223 06:42:54.981680 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:54 crc kubenswrapper[4626]: I0223 06:42:54.981742 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:54 crc kubenswrapper[4626]: I0223 06:42:54.981845 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:54 crc kubenswrapper[4626]: E0223 06:42:54.981889 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:54 crc kubenswrapper[4626]: E0223 06:42:54.981919 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:55 crc kubenswrapper[4626]: I0223 06:42:55.016787 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 00:02:43.046848195 +0000 UTC Feb 23 06:42:55 crc kubenswrapper[4626]: I0223 06:42:55.981787 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:55 crc kubenswrapper[4626]: E0223 06:42:55.981979 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.017876 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 04:12:19.550879182 +0000 UTC Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.179283 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.179321 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.179331 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.179345 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.179354 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:56Z","lastTransitionTime":"2026-02-23T06:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.189770 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:56Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.192455 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.192564 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.192628 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.192686 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.192738 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:56Z","lastTransitionTime":"2026-02-23T06:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.201566 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:56Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.204450 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.204487 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.204517 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.204532 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.204542 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:56Z","lastTransitionTime":"2026-02-23T06:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.213401 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:56Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.216351 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.216385 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.216396 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.216411 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.216421 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:56Z","lastTransitionTime":"2026-02-23T06:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.225672 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:56Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.228890 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.228931 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.228946 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.228975 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.228988 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:42:56Z","lastTransitionTime":"2026-02-23T06:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.238162 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:56Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.238299 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.981913 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.982092 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.982343 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:56 crc kubenswrapper[4626]: I0223 06:42:56.982608 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.982796 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:56 crc kubenswrapper[4626]: E0223 06:42:56.982894 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:57 crc kubenswrapper[4626]: I0223 06:42:57.018725 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 12:04:37.742399711 +0000 UTC Feb 23 06:42:57 crc kubenswrapper[4626]: I0223 06:42:57.981864 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:57 crc kubenswrapper[4626]: E0223 06:42:57.982101 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:42:57 crc kubenswrapper[4626]: I0223 06:42:57.997631 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:57Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.012132 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.018854 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 02:43:56.171969423 +0000 UTC Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.023929 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.040727 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.051461 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.066243 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: E0223 06:42:58.069970 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.088205 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.098217 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.109248 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc 
kubenswrapper[4626]: I0223 06:42:58.120242 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c8
8f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.130860 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.140878 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.149828 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.161697 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.170023 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.181843 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405
e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.191610 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:42:58Z is after 2025-08-24T17:21:41Z" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.981127 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.981155 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:42:58 crc kubenswrapper[4626]: I0223 06:42:58.981205 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:42:58 crc kubenswrapper[4626]: E0223 06:42:58.981780 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:42:58 crc kubenswrapper[4626]: E0223 06:42:58.981960 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:42:58 crc kubenswrapper[4626]: E0223 06:42:58.982127 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:42:59 crc kubenswrapper[4626]: I0223 06:42:59.019690 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 05:55:37.829936561 +0000 UTC Feb 23 06:42:59 crc kubenswrapper[4626]: I0223 06:42:59.982152 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:42:59 crc kubenswrapper[4626]: E0223 06:42:59.982313 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:00 crc kubenswrapper[4626]: I0223 06:43:00.020716 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:50:43.621600573 +0000 UTC Feb 23 06:43:00 crc kubenswrapper[4626]: I0223 06:43:00.981359 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:00 crc kubenswrapper[4626]: E0223 06:43:00.981533 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:00 crc kubenswrapper[4626]: I0223 06:43:00.981683 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:00 crc kubenswrapper[4626]: E0223 06:43:00.981769 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:00 crc kubenswrapper[4626]: I0223 06:43:00.981779 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:00 crc kubenswrapper[4626]: E0223 06:43:00.982040 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:01 crc kubenswrapper[4626]: I0223 06:43:01.021818 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 09:29:02.568941236 +0000 UTC Feb 23 06:43:01 crc kubenswrapper[4626]: I0223 06:43:01.982064 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:01 crc kubenswrapper[4626]: E0223 06:43:01.982304 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:02 crc kubenswrapper[4626]: I0223 06:43:02.022312 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:12:56.819928765 +0000 UTC Feb 23 06:43:02 crc kubenswrapper[4626]: I0223 06:43:02.981339 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:02 crc kubenswrapper[4626]: I0223 06:43:02.981378 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:02 crc kubenswrapper[4626]: I0223 06:43:02.981356 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:02 crc kubenswrapper[4626]: E0223 06:43:02.982073 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:02 crc kubenswrapper[4626]: E0223 06:43:02.982209 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:02 crc kubenswrapper[4626]: E0223 06:43:02.982373 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:02 crc kubenswrapper[4626]: I0223 06:43:02.994765 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 23 06:43:03 crc kubenswrapper[4626]: I0223 06:43:03.022749 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 05:14:34.708320818 +0000 UTC Feb 23 06:43:03 crc kubenswrapper[4626]: E0223 06:43:03.070929 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:43:03 crc kubenswrapper[4626]: I0223 06:43:03.981625 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:03 crc kubenswrapper[4626]: E0223 06:43:03.982021 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:03 crc kubenswrapper[4626]: I0223 06:43:03.982613 4626 scope.go:117] "RemoveContainer" containerID="7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b" Feb 23 06:43:03 crc kubenswrapper[4626]: E0223 06:43:03.982790 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:43:04 crc kubenswrapper[4626]: I0223 06:43:04.023210 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 03:24:20.575780299 +0000 UTC Feb 23 06:43:04 crc kubenswrapper[4626]: I0223 06:43:04.982085 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:04 crc kubenswrapper[4626]: I0223 06:43:04.982195 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:04 crc kubenswrapper[4626]: E0223 06:43:04.982251 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:04 crc kubenswrapper[4626]: E0223 06:43:04.982372 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:04 crc kubenswrapper[4626]: I0223 06:43:04.982487 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:04 crc kubenswrapper[4626]: E0223 06:43:04.982577 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:05 crc kubenswrapper[4626]: I0223 06:43:05.023602 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 18:11:25.329914382 +0000 UTC Feb 23 06:43:05 crc kubenswrapper[4626]: I0223 06:43:05.981893 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:05 crc kubenswrapper[4626]: E0223 06:43:05.982057 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.024484 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 03:48:37.077024442 +0000 UTC Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.255354 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.255408 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.255421 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.255444 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.255456 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:06Z","lastTransitionTime":"2026-02-23T06:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.270774 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:06Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.275021 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.275056 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.275069 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.275099 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.275112 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:06Z","lastTransitionTime":"2026-02-23T06:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.286162 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:06Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.289556 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.289592 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.289602 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.289619 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.289631 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:06Z","lastTransitionTime":"2026-02-23T06:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.301165 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:06Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.305300 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.305359 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.305372 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.305395 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.305411 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:06Z","lastTransitionTime":"2026-02-23T06:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.316289 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:06Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.319455 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.319489 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.319518 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.319533 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.319544 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:06Z","lastTransitionTime":"2026-02-23T06:43:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.328809 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:06Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.328952 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.981705 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.981856 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.981899 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:06 crc kubenswrapper[4626]: I0223 06:43:06.981921 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.982043 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:06 crc kubenswrapper[4626]: E0223 06:43:06.982078 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:07 crc kubenswrapper[4626]: I0223 06:43:07.025027 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 03:50:51.418416367 +0000 UTC Feb 23 06:43:07 crc kubenswrapper[4626]: I0223 06:43:07.981275 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:07 crc kubenswrapper[4626]: E0223 06:43:07.981431 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:07 crc kubenswrapper[4626]: I0223 06:43:07.994139 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:07Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.013762 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.022795 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.025550 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 21:30:40.276480557 +0000 UTC Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.030772 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.040331 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d
524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.048603 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.058828 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.068223 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: E0223 06:43:08.071838 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.079568 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc9385df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.089049 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.099988 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\"
,\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 
06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.111088 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.120409 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.130905 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.140682 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.156165 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.165592 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.175757 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.981681 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.981815 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:08 crc kubenswrapper[4626]: E0223 06:43:08.982012 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:08 crc kubenswrapper[4626]: E0223 06:43:08.981856 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:08 crc kubenswrapper[4626]: I0223 06:43:08.982210 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:08 crc kubenswrapper[4626]: E0223 06:43:08.982419 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:09 crc kubenswrapper[4626]: I0223 06:43:09.025875 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:38:41.076313191 +0000 UTC Feb 23 06:43:09 crc kubenswrapper[4626]: I0223 06:43:09.683695 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:09 crc kubenswrapper[4626]: E0223 06:43:09.683892 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:43:09 crc kubenswrapper[4626]: E0223 06:43:09.684008 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:43:41.683987358 +0000 UTC m=+174.023316624 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:43:09 crc kubenswrapper[4626]: I0223 06:43:09.981690 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:09 crc kubenswrapper[4626]: E0223 06:43:09.981895 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:10 crc kubenswrapper[4626]: I0223 06:43:10.026711 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 17:10:49.869974123 +0000 UTC Feb 23 06:43:10 crc kubenswrapper[4626]: I0223 06:43:10.981380 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:10 crc kubenswrapper[4626]: E0223 06:43:10.981804 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:10 crc kubenswrapper[4626]: I0223 06:43:10.981406 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:10 crc kubenswrapper[4626]: E0223 06:43:10.982006 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:10 crc kubenswrapper[4626]: I0223 06:43:10.981380 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:10 crc kubenswrapper[4626]: E0223 06:43:10.982223 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.027834 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:47:31.338811112 +0000 UTC Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.549314 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/0.log" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.549584 4626 generic.go:334] "Generic (PLEG): container finished" podID="27fe907f-67db-4a19-a485-22debfb92983" containerID="ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd" exitCode=1 Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.549700 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerDied","Data":"ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd"} Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.550275 4626 scope.go:117] "RemoveContainer" containerID="ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.563295 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.572990 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.580606 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.590381 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\"
,\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 
06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.600798 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.610110 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.621304 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.633247 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.647658 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.658620 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.666842 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.674262 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.687386 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.695366 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.703551 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc 
kubenswrapper[4626]: I0223 06:43:11.712274 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.721030 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.728798 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:11 crc kubenswrapper[4626]: I0223 06:43:11.981544 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:11 crc kubenswrapper[4626]: E0223 06:43:11.981707 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.027983 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 06:49:00.168853891 +0000 UTC Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.555639 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/0.log" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.555703 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerStarted","Data":"fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649"} Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.567379 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.579867 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf
28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.589055 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.601534 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.610010 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.617635 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.624153 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.637100 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.643691 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.650098 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc 
kubenswrapper[4626]: I0223 06:43:12.658096 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.665790 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.674040 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.681446 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.688900 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.697391 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.707033 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\"
,\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 
06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.723364 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.981335 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.981390 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:12 crc kubenswrapper[4626]: I0223 06:43:12.981486 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:12 crc kubenswrapper[4626]: E0223 06:43:12.981608 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:12 crc kubenswrapper[4626]: E0223 06:43:12.981712 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:12 crc kubenswrapper[4626]: E0223 06:43:12.981882 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:13 crc kubenswrapper[4626]: I0223 06:43:13.028669 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 14:22:23.053113772 +0000 UTC Feb 23 06:43:13 crc kubenswrapper[4626]: E0223 06:43:13.072960 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:43:13 crc kubenswrapper[4626]: I0223 06:43:13.981565 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:13 crc kubenswrapper[4626]: E0223 06:43:13.981718 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:14 crc kubenswrapper[4626]: I0223 06:43:14.029015 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 22:46:29.271265041 +0000 UTC Feb 23 06:43:14 crc kubenswrapper[4626]: I0223 06:43:14.981927 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:14 crc kubenswrapper[4626]: I0223 06:43:14.982017 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:14 crc kubenswrapper[4626]: E0223 06:43:14.982077 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:14 crc kubenswrapper[4626]: E0223 06:43:14.982213 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:14 crc kubenswrapper[4626]: I0223 06:43:14.982295 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:14 crc kubenswrapper[4626]: E0223 06:43:14.982387 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:15 crc kubenswrapper[4626]: I0223 06:43:15.029822 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 07:02:05.557562002 +0000 UTC Feb 23 06:43:15 crc kubenswrapper[4626]: I0223 06:43:15.982188 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:15 crc kubenswrapper[4626]: E0223 06:43:15.983234 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.030309 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 03:03:13.650120835 +0000 UTC Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.352962 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.353002 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.353013 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.353028 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.353038 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:16Z","lastTransitionTime":"2026-02-23T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.363208 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.366869 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.366904 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.366916 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.366932 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.366942 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:16Z","lastTransitionTime":"2026-02-23T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.377181 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.380306 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.380345 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.380358 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.380375 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.380387 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:16Z","lastTransitionTime":"2026-02-23T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.389540 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.392361 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.392398 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.392407 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.392425 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.392440 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:16Z","lastTransitionTime":"2026-02-23T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.401099 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.403800 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.403835 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.403849 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.403862 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.403872 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:16Z","lastTransitionTime":"2026-02-23T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.413129 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.413241 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.981719 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.981820 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.981844 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.981989 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.982128 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:16 crc kubenswrapper[4626]: E0223 06:43:16.982473 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:16 crc kubenswrapper[4626]: I0223 06:43:16.982717 4626 scope.go:117] "RemoveContainer" containerID="7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.031365 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:22:41.568762711 +0000 UTC Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.576257 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/2.log" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.579349 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2"} Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.579875 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.591758 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.601705 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.612022 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.626990 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.634413 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.641758 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc 
kubenswrapper[4626]: I0223 06:43:17.652698 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c8
8f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.663388 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.675710 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.685612 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.697060 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.710689 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.731785 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405
e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.752697 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.762338 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.772598 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.781039 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.794728 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.981910 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:17 crc kubenswrapper[4626]: E0223 06:43:17.982096 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:17 crc kubenswrapper[4626]: I0223 06:43:17.993566 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc9385df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.004089 4626 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.014019 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\"
,\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 
06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.023563 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.032103 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 22:03:17.340801796 +0000 UTC Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.032217 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.040175 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.049318 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.060010 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf
28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.073616 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.076461 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.086326 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.096078 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.105359 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.112622 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.120158 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc 
kubenswrapper[4626]: I0223 06:43:18.128791 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.137339 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.145270 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.164639 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.584807 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/3.log" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.585263 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/2.log" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.587415 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2" exitCode=1 Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.587455 4626 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2"} Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.587513 4626 scope.go:117] "RemoveContainer" containerID="7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.588102 4626 scope.go:117] "RemoveContainer" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2" Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.588244 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.600170 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.611015 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\"
,\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 
06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.621201 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.631679 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.640329 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.650449 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.661549 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.673172 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf
28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.686789 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.697723 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.707236 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.716986 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.725874 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc 
kubenswrapper[4626]: I0223 06:43:18.734548 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.743871 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.752113 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.767052 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7aaa87325a17821106b4fbcfa316ae4302e572ee94dcd6222cf5ca7a393d147b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:42:47Z\\\",\\\"message\\\":\\\"reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.629142 6509 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 06:42:47.629305 6509 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 06:42:47.630821 6509 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0223 06:42:47.630961 6509 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 06:42:47.631067 6509 factory.go:656] Stopping watch factory\\\\nI0223 06:42:47.631094 6509 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 06:42:47.631102 6509 handler.go:208] Removed *v1.Node event handler 2\\\\nI0223 06:42:47.649608 6509 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0223 06:42:47.649633 6509 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0223 06:42:47.649697 6509 ovnkube.go:599] Stopped ovnkube\\\\nI0223 06:42:47.649730 6509 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0223 06:42:47.649794 6509 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:17Z\\\",\\\"message\\\":\\\"etry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660257 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0223 06:43:17.660263 6849 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 06:43:17.660274 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 
failed attempt(s)\\\\nI0223 06:43:17.660209 6849 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0223 06:43:17.660279 6849 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660283 6849 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660287 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0223 06:43:17.660289 6849 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660009 6849 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.775731 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.960161 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.960268 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960348 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.960328005 +0000 UTC m=+215.299657281 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960390 4626 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960432 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.960425811 +0000 UTC m=+215.299755077 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.960390 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960538 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960557 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960575 4626 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.960591 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960615 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.960603088 +0000 UTC m=+215.299932364 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.960658 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960781 4626 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960873 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.960855507 +0000 UTC m=+215.300184773 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960873 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960916 4626 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960935 4626 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.960973 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.960966468 +0000 UTC m=+215.300295734 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.981443 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.981481 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:43:18 crc kubenswrapper[4626]: I0223 06:43:18.981485 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.981572 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.981747 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:43:18 crc kubenswrapper[4626]: E0223 06:43:18.981790 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.032186 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 08:26:31.211903606 +0000 UTC
Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.592617 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/3.log"
Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.595172 4626 scope.go:117] "RemoveContainer" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2"
Feb 23 06:43:19 crc kubenswrapper[4626]: E0223 06:43:19.595333 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919"
Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.610002 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.618162 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.626889 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.635117 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.643397 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.651401 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.658251 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.670965 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:17Z\\\",\\\"message\\\":\\\"etry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660257 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0223 06:43:17.660263 6849 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 06:43:17.660274 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 06:43:17.660209 6849 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0223 06:43:17.660279 6849 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660283 6849 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660287 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0223 06:43:17.660289 6849 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660009 6849 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.677636 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.684577 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc 
kubenswrapper[4626]: I0223 06:43:19.691041 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.700689 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\"
,\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 
06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.712772 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.721052 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.728882 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.736238 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.744956 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.754389 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf
28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:19 crc kubenswrapper[4626]: I0223 06:43:19.981237 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:19 crc kubenswrapper[4626]: E0223 06:43:19.981390 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:20 crc kubenswrapper[4626]: I0223 06:43:20.032984 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 15:45:57.611641439 +0000 UTC Feb 23 06:43:20 crc kubenswrapper[4626]: I0223 06:43:20.981294 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:20 crc kubenswrapper[4626]: I0223 06:43:20.981331 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:20 crc kubenswrapper[4626]: E0223 06:43:20.981656 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:20 crc kubenswrapper[4626]: E0223 06:43:20.981669 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:20 crc kubenswrapper[4626]: I0223 06:43:20.981343 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:20 crc kubenswrapper[4626]: E0223 06:43:20.981915 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:21 crc kubenswrapper[4626]: I0223 06:43:21.033216 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 22:26:26.434101556 +0000 UTC Feb 23 06:43:21 crc kubenswrapper[4626]: I0223 06:43:21.981554 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:21 crc kubenswrapper[4626]: E0223 06:43:21.981684 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:22 crc kubenswrapper[4626]: I0223 06:43:22.033952 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 08:09:56.081486587 +0000 UTC Feb 23 06:43:22 crc kubenswrapper[4626]: I0223 06:43:22.981283 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:22 crc kubenswrapper[4626]: I0223 06:43:22.981397 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:22 crc kubenswrapper[4626]: E0223 06:43:22.981486 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:22 crc kubenswrapper[4626]: I0223 06:43:22.981703 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:22 crc kubenswrapper[4626]: E0223 06:43:22.981705 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:22 crc kubenswrapper[4626]: E0223 06:43:22.981913 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:22 crc kubenswrapper[4626]: I0223 06:43:22.992224 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 23 06:43:23 crc kubenswrapper[4626]: I0223 06:43:23.034130 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 01:42:06.86803943 +0000 UTC Feb 23 06:43:23 crc kubenswrapper[4626]: E0223 06:43:23.075320 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:43:23 crc kubenswrapper[4626]: I0223 06:43:23.981987 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:23 crc kubenswrapper[4626]: E0223 06:43:23.982153 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:24 crc kubenswrapper[4626]: I0223 06:43:24.035151 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 10:33:08.54030507 +0000 UTC Feb 23 06:43:24 crc kubenswrapper[4626]: I0223 06:43:24.981423 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:24 crc kubenswrapper[4626]: I0223 06:43:24.981637 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:24 crc kubenswrapper[4626]: I0223 06:43:24.981587 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:24 crc kubenswrapper[4626]: E0223 06:43:24.981835 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:24 crc kubenswrapper[4626]: E0223 06:43:24.982014 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:24 crc kubenswrapper[4626]: E0223 06:43:24.982155 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:25 crc kubenswrapper[4626]: I0223 06:43:25.035243 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:54:22.660322713 +0000 UTC Feb 23 06:43:25 crc kubenswrapper[4626]: I0223 06:43:25.981206 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:25 crc kubenswrapper[4626]: E0223 06:43:25.981372 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.036021 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 02:05:39.876337105 +0000 UTC Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.489958 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.490379 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.490392 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.490423 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.490436 4626 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:26Z","lastTransitionTime":"2026-02-23T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.501553 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.506110 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.506170 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.506185 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.506207 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.506220 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:26Z","lastTransitionTime":"2026-02-23T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.518923 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.527519 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.527547 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.527556 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.527569 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.527577 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:26Z","lastTransitionTime":"2026-02-23T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.537276 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.539762 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.539788 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.539799 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.539811 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.539819 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:26Z","lastTransitionTime":"2026-02-23T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.548578 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.550944 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.550979 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.550990 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.551023 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.551034 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:26Z","lastTransitionTime":"2026-02-23T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.558984 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.559090 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.981722 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.981742 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:26 crc kubenswrapper[4626]: I0223 06:43:26.981722 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.981860 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.981909 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:26 crc kubenswrapper[4626]: E0223 06:43:26.982000 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:27 crc kubenswrapper[4626]: I0223 06:43:27.036841 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 04:09:30.072797169 +0000 UTC Feb 23 06:43:27 crc kubenswrapper[4626]: I0223 06:43:27.981664 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:27 crc kubenswrapper[4626]: E0223 06:43:27.981851 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:27 crc kubenswrapper[4626]: I0223 06:43:27.998825 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c03de4-ae89-46ed-af39-b527a31bbbef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb3b6468591bc11855d004b0b5ae0f9292185085b3f843f96cbada8f30137baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 
')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0223 06:41:20.024574 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0223 06:41:20.025551 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:41:20.026430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:41:20.027151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:41:48.195867 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0223 06:41:50.186205 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:41:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b79c01946e9f21e84f63573aec4dfe15408ce260d0c009b72abed6fa20c9335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2948892f6e9d19854d41fbe361ed6fa6b597f31a5a25a076d59e7378451a97c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f967d40becf0537406ba63da0c005ab507e8b8bac92379acd1fd456dc18eec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:27Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.009452 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.019597 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.027016 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.037558 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 12:00:28.837187306 +0000 UTC Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.041601 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:17Z\\\",\\\"message\\\":\\\"etry.go:365] Adding new object: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660257 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0223 06:43:17.660263 6849 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 06:43:17.660274 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 06:43:17.660209 6849 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0223 06:43:17.660279 6849 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660283 6849 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660287 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0223 06:43:17.660289 6849 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660009 6849 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.061353 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.069102 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc 
kubenswrapper[4626]: I0223 06:43:28.076367 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: E0223 06:43:28.076728 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.086278 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 
06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.095675 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.103752 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.111765 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.119336 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.127372 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.136488 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf
28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.149090 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.157464 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.168040 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.177983 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.981406 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.981406 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:43:28 crc kubenswrapper[4626]: I0223 06:43:28.981600 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:43:28 crc kubenswrapper[4626]: E0223 06:43:28.981653 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:43:28 crc kubenswrapper[4626]: E0223 06:43:28.981827 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:43:28 crc kubenswrapper[4626]: E0223 06:43:28.981966 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:43:29 crc kubenswrapper[4626]: I0223 06:43:29.038616 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 21:40:06.768190297 +0000 UTC
Feb 23 06:43:29 crc kubenswrapper[4626]: I0223 06:43:29.981871 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf"
Feb 23 06:43:29 crc kubenswrapper[4626]: E0223 06:43:29.982331 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08"
Feb 23 06:43:30 crc kubenswrapper[4626]: I0223 06:43:30.039533 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 07:34:10.940528375 +0000 UTC
Feb 23 06:43:30 crc kubenswrapper[4626]: I0223 06:43:30.982636 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:43:30 crc kubenswrapper[4626]: E0223 06:43:30.983491 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:43:30 crc kubenswrapper[4626]: I0223 06:43:30.982658 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:43:30 crc kubenswrapper[4626]: I0223 06:43:30.982636 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:43:30 crc kubenswrapper[4626]: E0223 06:43:30.984942 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:43:30 crc kubenswrapper[4626]: E0223 06:43:30.985062 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:43:31 crc kubenswrapper[4626]: I0223 06:43:31.039926 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 01:51:14.736434414 +0000 UTC
Feb 23 06:43:31 crc kubenswrapper[4626]: I0223 06:43:31.981232 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf"
Feb 23 06:43:31 crc kubenswrapper[4626]: E0223 06:43:31.981418 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08"
Feb 23 06:43:32 crc kubenswrapper[4626]: I0223 06:43:32.041028 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 08:40:15.968078095 +0000 UTC
Feb 23 06:43:32 crc kubenswrapper[4626]: I0223 06:43:32.981217 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:43:32 crc kubenswrapper[4626]: I0223 06:43:32.981282 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:43:32 crc kubenswrapper[4626]: I0223 06:43:32.981407 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:43:32 crc kubenswrapper[4626]: E0223 06:43:32.981398 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:43:32 crc kubenswrapper[4626]: E0223 06:43:32.981798 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:43:32 crc kubenswrapper[4626]: E0223 06:43:32.981956 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:43:33 crc kubenswrapper[4626]: I0223 06:43:33.042137 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 21:00:11.610032606 +0000 UTC
Feb 23 06:43:33 crc kubenswrapper[4626]: E0223 06:43:33.078386 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 23 06:43:33 crc kubenswrapper[4626]: I0223 06:43:33.982032 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf"
Feb 23 06:43:33 crc kubenswrapper[4626]: E0223 06:43:33.982216 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08"
Feb 23 06:43:34 crc kubenswrapper[4626]: I0223 06:43:34.042697 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:07:55.185427911 +0000 UTC
Feb 23 06:43:34 crc kubenswrapper[4626]: I0223 06:43:34.981114 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:43:34 crc kubenswrapper[4626]: I0223 06:43:34.981226 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:43:34 crc kubenswrapper[4626]: E0223 06:43:34.981295 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:43:34 crc kubenswrapper[4626]: I0223 06:43:34.981446 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:43:34 crc kubenswrapper[4626]: E0223 06:43:34.981595 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:43:34 crc kubenswrapper[4626]: E0223 06:43:34.981711 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:43:34 crc kubenswrapper[4626]: I0223 06:43:34.982287 4626 scope.go:117] "RemoveContainer" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2"
Feb 23 06:43:34 crc kubenswrapper[4626]: E0223 06:43:34.982449 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919"
Feb 23 06:43:35 crc kubenswrapper[4626]: I0223 06:43:35.043157 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:25:30.106660219 +0000 UTC
Feb 23 06:43:35 crc kubenswrapper[4626]: I0223 06:43:35.981176 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf"
Feb 23 06:43:35 crc kubenswrapper[4626]: E0223 06:43:35.981365 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08"
Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.043765 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 05:53:57.709489616 +0000 UTC
Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.845581 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.845616 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.845624 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.845638 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.845647 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:36Z","lastTransitionTime":"2026-02-23T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.854978 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.857777 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.857806 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.857824 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.857841 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.857850 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:36Z","lastTransitionTime":"2026-02-23T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.866617 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.869033 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.869075 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.869085 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.869101 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.869113 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:36Z","lastTransitionTime":"2026-02-23T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.878435 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.881389 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.881426 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.881436 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.881450 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.881461 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:36Z","lastTransitionTime":"2026-02-23T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.889787 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.892355 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.892385 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.892394 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.892406 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.892426 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:36Z","lastTransitionTime":"2026-02-23T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.899975 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.900071 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.981758 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.981848 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.981886 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:36 crc kubenswrapper[4626]: I0223 06:43:36.981855 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.982026 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:36 crc kubenswrapper[4626]: E0223 06:43:36.982043 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:37 crc kubenswrapper[4626]: I0223 06:43:37.044341 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 03:42:11.080618738 +0000 UTC Feb 23 06:43:37 crc kubenswrapper[4626]: I0223 06:43:37.981261 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:37 crc kubenswrapper[4626]: E0223 06:43:37.981452 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:37 crc kubenswrapper[4626]: I0223 06:43:37.995960 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:37Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.008064 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405
e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.023701 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4b
c61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.033959 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.043716 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.044760 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 07:01:58.015690962 +0000 UTC Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.053306 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.060732 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc 
kubenswrapper[4626]: I0223 06:43:38.070188 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c03de4-ae89-46ed-af39-b527a31bbbef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb3b6468591bc11855d004b0b5ae0f9292185085b3f843f96cbada8f30137baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0223 06:41:20.024574 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0223 06:41:20.025551 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:41:20.026430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:41:20.027151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:41:48.195867 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0223 06:41:50.186205 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:41:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b79c01946e9f21e84f63573aec4dfe15408ce260d0c009b72abed6fa20c9335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2948892f6e9d19854d41fbe361ed6fa6b597f31a5a25a076d59e7378451a97c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f967d40becf0537406ba63da0c005ab507e8b8bac92379acd1fd456dc18eec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: E0223 06:43:38.079049 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.079393 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971
522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.092744 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.101037 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.115534 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:17Z\\\",\\\"message\\\":\\\"etry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660257 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0223 06:43:17.660263 6849 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 06:43:17.660274 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 06:43:17.660209 6849 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0223 06:43:17.660279 6849 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660283 6849 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660287 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0223 06:43:17.660289 6849 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660009 6849 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.123308 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.131676 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc 
kubenswrapper[4626]: I0223 06:43:38.150298 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c8
8f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\",\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.160723 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.169634 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.177276 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.185023 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc93
85df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:38Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.981492 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.981519 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:38 crc kubenswrapper[4626]: E0223 06:43:38.981689 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:38 crc kubenswrapper[4626]: E0223 06:43:38.981811 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:38 crc kubenswrapper[4626]: I0223 06:43:38.982023 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:38 crc kubenswrapper[4626]: E0223 06:43:38.982232 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:39 crc kubenswrapper[4626]: I0223 06:43:39.045760 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 18:03:38.41089468 +0000 UTC Feb 23 06:43:39 crc kubenswrapper[4626]: I0223 06:43:39.981407 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:39 crc kubenswrapper[4626]: E0223 06:43:39.981646 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:40 crc kubenswrapper[4626]: I0223 06:43:40.046414 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 03:59:43.866158925 +0000 UTC Feb 23 06:43:40 crc kubenswrapper[4626]: I0223 06:43:40.981579 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:40 crc kubenswrapper[4626]: I0223 06:43:40.981702 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:40 crc kubenswrapper[4626]: E0223 06:43:40.981747 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:40 crc kubenswrapper[4626]: I0223 06:43:40.981819 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:40 crc kubenswrapper[4626]: E0223 06:43:40.981912 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:40 crc kubenswrapper[4626]: E0223 06:43:40.981985 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:41 crc kubenswrapper[4626]: I0223 06:43:41.046909 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 02:43:13.919209407 +0000 UTC Feb 23 06:43:41 crc kubenswrapper[4626]: I0223 06:43:41.775087 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:41 crc kubenswrapper[4626]: E0223 06:43:41.775315 4626 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:43:41 crc kubenswrapper[4626]: E0223 06:43:41.775411 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs podName:53b6af64-b3dc-44ae-96bd-90ab1b79dc08 nodeName:}" failed. No retries permitted until 2026-02-23 06:44:45.77538984 +0000 UTC m=+238.114719096 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs") pod "network-metrics-daemon-ls5wf" (UID: "53b6af64-b3dc-44ae-96bd-90ab1b79dc08") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:43:41 crc kubenswrapper[4626]: I0223 06:43:41.981614 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:41 crc kubenswrapper[4626]: E0223 06:43:41.981753 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:42 crc kubenswrapper[4626]: I0223 06:43:42.047320 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 04:08:30.206233124 +0000 UTC Feb 23 06:43:42 crc kubenswrapper[4626]: I0223 06:43:42.981308 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:42 crc kubenswrapper[4626]: I0223 06:43:42.981354 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:42 crc kubenswrapper[4626]: I0223 06:43:42.981441 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:42 crc kubenswrapper[4626]: E0223 06:43:42.981471 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:42 crc kubenswrapper[4626]: E0223 06:43:42.981603 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:42 crc kubenswrapper[4626]: E0223 06:43:42.981725 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:43 crc kubenswrapper[4626]: I0223 06:43:43.048151 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 00:01:30.820997359 +0000 UTC Feb 23 06:43:43 crc kubenswrapper[4626]: E0223 06:43:43.080306 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:43:43 crc kubenswrapper[4626]: I0223 06:43:43.981387 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:43 crc kubenswrapper[4626]: E0223 06:43:43.981606 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:44 crc kubenswrapper[4626]: I0223 06:43:44.049285 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 04:02:02.796946484 +0000 UTC Feb 23 06:43:44 crc kubenswrapper[4626]: I0223 06:43:44.981689 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:44 crc kubenswrapper[4626]: I0223 06:43:44.981741 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:44 crc kubenswrapper[4626]: I0223 06:43:44.981739 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:44 crc kubenswrapper[4626]: E0223 06:43:44.981875 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:44 crc kubenswrapper[4626]: E0223 06:43:44.981999 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:44 crc kubenswrapper[4626]: E0223 06:43:44.982208 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:45 crc kubenswrapper[4626]: I0223 06:43:45.049886 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 03:10:43.596505814 +0000 UTC Feb 23 06:43:45 crc kubenswrapper[4626]: I0223 06:43:45.981177 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:45 crc kubenswrapper[4626]: E0223 06:43:45.981591 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:45 crc kubenswrapper[4626]: I0223 06:43:45.982255 4626 scope.go:117] "RemoveContainer" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2" Feb 23 06:43:45 crc kubenswrapper[4626]: E0223 06:43:45.982399 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:43:46 crc kubenswrapper[4626]: I0223 06:43:46.050139 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 08:16:25.888368987 +0000 UTC Feb 23 06:43:46 crc kubenswrapper[4626]: I0223 06:43:46.981762 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:46 crc kubenswrapper[4626]: I0223 06:43:46.981863 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:46 crc kubenswrapper[4626]: E0223 06:43:46.981922 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:46 crc kubenswrapper[4626]: I0223 06:43:46.981870 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:46 crc kubenswrapper[4626]: E0223 06:43:46.982104 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:46 crc kubenswrapper[4626]: E0223 06:43:46.982143 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.050563 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 23:00:59.338767511 +0000 UTC Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.165310 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.165343 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.165352 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.165367 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 
06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.165380 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:47Z","lastTransitionTime":"2026-02-23T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:43:47 crc kubenswrapper[4626]: E0223 06:43:47.177819 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.180971 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.181015 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.181029 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.181049 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.181062 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:47Z","lastTransitionTime":"2026-02-23T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:47 crc kubenswrapper[4626]: E0223 06:43:47.191847 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.195193 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.195230 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.195242 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.195258 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.195277 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:47Z","lastTransitionTime":"2026-02-23T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:47 crc kubenswrapper[4626]: E0223 06:43:47.204036 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.207154 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.207205 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.207218 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.207238 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.207250 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:47Z","lastTransitionTime":"2026-02-23T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:47 crc kubenswrapper[4626]: E0223 06:43:47.215857 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.218836 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.218877 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.218891 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.218910 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.218925 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:47Z","lastTransitionTime":"2026-02-23T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:43:47 crc kubenswrapper[4626]: E0223 06:43:47.229267 4626 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f11baa89-c04c-40b7-af0e-799ac4cacb38\\\",\\\"systemUUID\\\":\\\"f91cd4af-3be1-4260-a65d-11f80cafe5a5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:47 crc kubenswrapper[4626]: E0223 06:43:47.229414 4626 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.981339 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:47 crc kubenswrapper[4626]: E0223 06:43:47.981832 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:47 crc kubenswrapper[4626]: I0223 06:43:47.992275 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c07a7b21-10ef-4b99-87cf-80a2a941c363\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be9857e83aa341e5670933ec58a96e60ce4551822885cb227a167b7d936054f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2717e3576af78b5391c7c3cc2b1997dfdc9385df3557dd29b7cabc6087601f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h954v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rqdv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.000119 4626 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c7123c21-4435-4da3-955b-b25dcab80ddc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de4bf58ce1c331cca89b2242d3896befb70b7c089ba8714fa11e9013e06c198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad54d22d3dfee1626f2a853066d7811474fd1608fddc487f9d6c5f3d42132db6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:47Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.012396 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:57Z\\\"
,\\\"message\\\":\\\" shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0223 06:41:57.444886 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0223 06:41:57.444897 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0223 06:41:57.444970 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0223 06:41:57.444987 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0223 06:41:57.445040 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1771828916\\\\\\\\\\\\\\\" (2026-02-23 06:41:55 +0000 UTC to 2026-03-25 06:41:56 +0000 UTC (now=2026-02-23 06:41:57.444999054 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445160 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1771828917\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1771828917\\\\\\\\\\\\\\\" (2026-02-23 05:41:56 +0000 UTC to 2027-02-23 05:41:56 +0000 UTC (now=2026-02-23 06:41:57.44513968 +0000 UTC))\\\\\\\"\\\\nI0223 06:41:57.445183 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0223 06:41:57.445208 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0223 
06:41:57.445208 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319241986/tls.crt::/tmp/serving-cert-2319241986/tls.key\\\\\\\"\\\\nI0223 06:41:57.445220 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0223 06:41:57.445210 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.022844 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7efe1fc9c0564cd0ca5cd86d4f8ae277bcfef11a27579b2f9a0de63d24a4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.032199 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.040887 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b11f67b-b1fe-456a-843e-471433062d6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20940839b42856bb3b5bdf48e493435706d0ce5ac411dd32a8ccb3a681c905a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4278639261ec4c69e03244228687a082e6de589c
ab1662d3c1aa7bd4a526f0f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npl82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2jvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.050922 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 13:46:00.890397319 +0000 UTC Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.051961 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lbzx5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27fe907f-67db-4a19-a485-22debfb92983\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:43:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:11Z\\\",\\\"message\\\":\\\"2026-02-23T06:42:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85\\\\n2026-02-23T06:42:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93a3b1a1-47eb-4242-b98a-e03d1513ba85 to /host/opt/cni/bin/\\\\n2026-02-23T06:42:26Z [verbose] multus-daemon started\\\\n2026-02-23T06:42:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T06:43:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjprv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lbzx5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.064447 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-84c69" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cb6f72e-5acb-4a3b-8956-c8f89d47afe0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bf
28394b0d97e16c27e37e12c52e9719d8459abf63d191e49e67230c8cb4cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb1bf9e7f192898ab79cd5e7adc853070d26166261396eb47546069b18b03945\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1672c8e0cac0158d838a9e94a846bf8ea99e6aad73629b18688fbf0a26777e27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc2e432112a56c1905a8f94f3e6919d96937a04df1213ee23ecadc655feb735\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9405e17a87716a11d8bacbadf74875d9c3452eadfa8b1ce1597905ff5a266eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f8afcc511fefb61e51c253e59c32bd67a7b01780718094023df8bfed8f9e3f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40384923f4948da44b1f6b48342d51576794cfc8268400b134ae5f69e8a54e7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c4t25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-84c69\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.078839 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f77e44f-fa76-43c7-8558-66817da5daba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a66b18d8b67c8a6474ff65a6b86e37f23b8c8d2bfda220a077bb1d075faa4de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b5f7e1e7aacfd103544cdac9aa12995d81aace40852eb7165f0206593d5e38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27431a780c81be3098e13cf3ef0238433951d54183ad3f5d723d2eeb164d112b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ee67146994a61e4bc61bdbf2b9d2f357d6bae763b39e5ee5a9b2be29baa9286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f134a8ae91a5ceafbe6bd017d6b393ceb750cbcf834e0dd264e2d3b8f040410e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1eab9a8e03001abc24a5deab8bb650d68d7e6edccb32ee3c683279e1b3843b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee68ab3c560c46454e0cbc4a7118c40b8c6e0c9502b19e7341cfbbb318789751\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd1f8e5ac64a0316e4eff5f3b82493a2f27c17da3208f7d679447f6d31a7bec9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: E0223 06:43:48.081239 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.092823 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.103237 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b699cf9d45f42face8279a7442d6cfcb232c937bf3913b74dc42970987802723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a7491430f14a1e38e6ec715d27652713c5dcd4abd135b9858f0282e7cfbcc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.112940 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.122425 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qtbvk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27a832b0-4c0c-4f1a-9d07-770876ca1505\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee496d0837ad8cf3825c2736f9f1dfd58353010fcf4ecacdd360280ee37c7930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4znjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qtbvk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.131157 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chbhw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc 
kubenswrapper[4626]: I0223 06:43:48.141947 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8c03de4-ae89-46ed-af39-b527a31bbbef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb3b6468591bc11855d004b0b5ae0f9292185085b3f843f96cbada8f30137baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d90021108edd66868deef1a685431baf46f698cdc293f9d603de4c08a748908e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:41:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0223 06:41:20.024574 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0223 06:41:20.025551 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:41:20.026430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:41:20.027151 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:41:48.195867 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0223 06:41:50.186205 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:41:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:41:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b79c01946e9f21e84f63573aec4dfe15408ce260d0c009b72abed6fa20c9335\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2948892f6e9d19854d41fbe361ed6fa6b597f31a5a25a076d59e7378451a97c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f967d40becf0537406ba63da0c005ab507e8b8bac92379acd1fd456dc18eec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.151306 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca060376-b9ed-4ead-aefb-7c4c5062ef94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://544038e225f3cbc22d42b31c0832d20fcfb53907a31a737bb0dabf9beb5913a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d67877f8064c434e1d3eb335d38971522f024b8cbd04da53e38a89b3b5103ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31566e407ddc68b44b638caff0de70248d85ae4aa9266e01825fb0d4accf770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:40:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d40955b926a4cb5adffc6cd78afc0b570c44bf17d524554831b97c7dd9dff098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:40:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:40:48Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:40:48Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.160536 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://310226d56a880d1cc1288f1195a2b684bad3a59556c524cbb2d1e1f7d77dba5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.168281 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-km45b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939db1b0-d1bf-495e-a842-f0d102e2a420\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4b7fddb02dc25df3e3e676699244fe671ea0ade4a4f712db10ceb1580f726e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tfwjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-km45b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.186722 4626 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4eb8735-20e6-4bd1-8965-4a360e39a919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:42:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:27Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T06:43:17Z\\\",\\\"message\\\":\\\"etry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660257 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0223 06:43:17.660263 6849 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 06:43:17.660274 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 06:43:17.660209 6849 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI0223 06:43:17.660279 6849 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660283 6849 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 06:43:17.660287 6849 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI0223 06:43:17.660289 6849 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6\\\\nI0223 06:43:17.660009 6849 event.go:377] Event(v1.ObjectRefere\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:42:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a0d87283cc1abe7f3
03ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:42:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:42:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b5475\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:42:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lhplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:43:48Z is after 2025-08-24T17:21:41Z" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.981596 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.981693 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:48 crc kubenswrapper[4626]: E0223 06:43:48.981784 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:48 crc kubenswrapper[4626]: E0223 06:43:48.981855 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:48 crc kubenswrapper[4626]: I0223 06:43:48.981960 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:48 crc kubenswrapper[4626]: E0223 06:43:48.982134 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:49 crc kubenswrapper[4626]: I0223 06:43:49.051854 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 08:14:26.347637926 +0000 UTC Feb 23 06:43:49 crc kubenswrapper[4626]: I0223 06:43:49.982026 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:49 crc kubenswrapper[4626]: E0223 06:43:49.982202 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:50 crc kubenswrapper[4626]: I0223 06:43:50.052243 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 01:08:58.213901719 +0000 UTC Feb 23 06:43:50 crc kubenswrapper[4626]: I0223 06:43:50.981447 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:50 crc kubenswrapper[4626]: I0223 06:43:50.981551 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:50 crc kubenswrapper[4626]: E0223 06:43:50.981590 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:50 crc kubenswrapper[4626]: E0223 06:43:50.981756 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:50 crc kubenswrapper[4626]: I0223 06:43:50.981776 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:50 crc kubenswrapper[4626]: E0223 06:43:50.981891 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:51 crc kubenswrapper[4626]: I0223 06:43:51.053234 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 02:40:52.385807297 +0000 UTC Feb 23 06:43:51 crc kubenswrapper[4626]: I0223 06:43:51.981440 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:51 crc kubenswrapper[4626]: E0223 06:43:51.981656 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:52 crc kubenswrapper[4626]: I0223 06:43:52.053915 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 00:32:29.712545785 +0000 UTC Feb 23 06:43:52 crc kubenswrapper[4626]: I0223 06:43:52.981122 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:52 crc kubenswrapper[4626]: I0223 06:43:52.981165 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:52 crc kubenswrapper[4626]: I0223 06:43:52.981203 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:52 crc kubenswrapper[4626]: E0223 06:43:52.981273 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:52 crc kubenswrapper[4626]: E0223 06:43:52.981366 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:52 crc kubenswrapper[4626]: E0223 06:43:52.981469 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:53 crc kubenswrapper[4626]: I0223 06:43:53.054595 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 17:59:09.253831044 +0000 UTC Feb 23 06:43:53 crc kubenswrapper[4626]: E0223 06:43:53.082765 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:43:53 crc kubenswrapper[4626]: I0223 06:43:53.981610 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:53 crc kubenswrapper[4626]: E0223 06:43:53.981790 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:54 crc kubenswrapper[4626]: I0223 06:43:54.054987 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 04:17:49.472164863 +0000 UTC Feb 23 06:43:54 crc kubenswrapper[4626]: I0223 06:43:54.981447 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:54 crc kubenswrapper[4626]: I0223 06:43:54.981493 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:54 crc kubenswrapper[4626]: I0223 06:43:54.981490 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:54 crc kubenswrapper[4626]: E0223 06:43:54.981653 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:54 crc kubenswrapper[4626]: E0223 06:43:54.981822 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:54 crc kubenswrapper[4626]: E0223 06:43:54.981916 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:55 crc kubenswrapper[4626]: I0223 06:43:55.055174 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 00:15:18.437881682 +0000 UTC Feb 23 06:43:55 crc kubenswrapper[4626]: I0223 06:43:55.981549 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:55 crc kubenswrapper[4626]: E0223 06:43:55.981748 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:56 crc kubenswrapper[4626]: I0223 06:43:56.055583 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 00:30:53.58607605 +0000 UTC Feb 23 06:43:56 crc kubenswrapper[4626]: I0223 06:43:56.981349 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:56 crc kubenswrapper[4626]: I0223 06:43:56.981378 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:56 crc kubenswrapper[4626]: I0223 06:43:56.981349 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:56 crc kubenswrapper[4626]: E0223 06:43:56.981866 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:56 crc kubenswrapper[4626]: E0223 06:43:56.981979 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:56 crc kubenswrapper[4626]: E0223 06:43:56.982112 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:56 crc kubenswrapper[4626]: I0223 06:43:56.982573 4626 scope.go:117] "RemoveContainer" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2" Feb 23 06:43:56 crc kubenswrapper[4626]: E0223 06:43:56.982789 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lhplf_openshift-ovn-kubernetes(a4eb8735-20e6-4bd1-8965-4a360e39a919)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.056097 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 09:00:00.357562955 +0000 UTC Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.511733 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.511760 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.511769 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.511784 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.511793 4626 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:43:57Z","lastTransitionTime":"2026-02-23T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.548761 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw"] Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.549219 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.551568 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.551914 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.552276 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.552405 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.572265 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=87.572242791 podStartE2EDuration="1m27.572242791s" podCreationTimestamp="2026-02-23 06:42:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.57101088 +0000 UTC m=+189.910340147" watchObservedRunningTime="2026-02-23 06:43:57.572242791 +0000 UTC m=+189.911572057" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.606624 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/79882abb-d454-48ec-bef0-d60a2e0b81ca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.606699 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79882abb-d454-48ec-bef0-d60a2e0b81ca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.606722 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79882abb-d454-48ec-bef0-d60a2e0b81ca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.606742 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/79882abb-d454-48ec-bef0-d60a2e0b81ca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.606781 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79882abb-d454-48ec-bef0-d60a2e0b81ca-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.632594 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=35.632580781 podStartE2EDuration="35.632580781s" podCreationTimestamp="2026-02-23 06:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.62294219 +0000 UTC m=+189.962271457" watchObservedRunningTime="2026-02-23 06:43:57.632580781 +0000 UTC m=+189.971910048" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.640572 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=55.640562826 podStartE2EDuration="55.640562826s" podCreationTimestamp="2026-02-23 06:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.633229424 +0000 UTC m=+189.972558690" watchObservedRunningTime="2026-02-23 06:43:57.640562826 +0000 UTC m=+189.979892093" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.648334 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-km45b" podStartSLOduration=122.648310755 podStartE2EDuration="2m2.648310755s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.648282192 +0000 UTC m=+189.987611458" watchObservedRunningTime="2026-02-23 06:43:57.648310755 +0000 UTC m=+189.987640021" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.677382 4626 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qtbvk" podStartSLOduration=122.677345046 podStartE2EDuration="2m2.677345046s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.676673039 +0000 UTC m=+190.016002305" watchObservedRunningTime="2026-02-23 06:43:57.677345046 +0000 UTC m=+190.016674301" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.695376 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=103.695333059 podStartE2EDuration="1m43.695333059s" podCreationTimestamp="2026-02-23 06:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.694704345 +0000 UTC m=+190.034033611" watchObservedRunningTime="2026-02-23 06:43:57.695333059 +0000 UTC m=+190.034662315" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.706933 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=92.706892421 podStartE2EDuration="1m32.706892421s" podCreationTimestamp="2026-02-23 06:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.706246603 +0000 UTC m=+190.045575869" watchObservedRunningTime="2026-02-23 06:43:57.706892421 +0000 UTC m=+190.046221686" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.707705 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/79882abb-d454-48ec-bef0-d60a2e0b81ca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" 
(UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.707763 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79882abb-d454-48ec-bef0-d60a2e0b81ca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.707796 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79882abb-d454-48ec-bef0-d60a2e0b81ca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.707831 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/79882abb-d454-48ec-bef0-d60a2e0b81ca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.707898 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79882abb-d454-48ec-bef0-d60a2e0b81ca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.708035 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/79882abb-d454-48ec-bef0-d60a2e0b81ca-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.708038 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/79882abb-d454-48ec-bef0-d60a2e0b81ca-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.709213 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79882abb-d454-48ec-bef0-d60a2e0b81ca-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.720924 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79882abb-d454-48ec-bef0-d60a2e0b81ca-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.725086 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/1.log" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.725582 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/0.log" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 
06:43:57.725618 4626 generic.go:334] "Generic (PLEG): container finished" podID="27fe907f-67db-4a19-a485-22debfb92983" containerID="fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649" exitCode=1 Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.725648 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerDied","Data":"fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649"} Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.725683 4626 scope.go:117] "RemoveContainer" containerID="ae78a35f6e9de75671f82fdb44f6f4b9b0255bc3f4f328964fd599d10fc765dd" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.726074 4626 scope.go:117] "RemoveContainer" containerID="fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649" Feb 23 06:43:57 crc kubenswrapper[4626]: E0223 06:43:57.726211 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-lbzx5_openshift-multus(27fe907f-67db-4a19-a485-22debfb92983)\"" pod="openshift-multus/multus-lbzx5" podUID="27fe907f-67db-4a19-a485-22debfb92983" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.731525 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79882abb-d454-48ec-bef0-d60a2e0b81ca-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kbcpw\" (UID: \"79882abb-d454-48ec-bef0-d60a2e0b81ca\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.758313 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podStartSLOduration=122.758298973 podStartE2EDuration="2m2.758298973s" 
podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.758133317 +0000 UTC m=+190.097462584" watchObservedRunningTime="2026-02-23 06:43:57.758298973 +0000 UTC m=+190.097628228" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.778851 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rqdv6" podStartSLOduration=121.778836369 podStartE2EDuration="2m1.778836369s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.767792198 +0000 UTC m=+190.107121463" watchObservedRunningTime="2026-02-23 06:43:57.778836369 +0000 UTC m=+190.118165626" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.812936 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-84c69" podStartSLOduration=122.812918389 podStartE2EDuration="2m2.812918389s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.812140301 +0000 UTC m=+190.151469567" watchObservedRunningTime="2026-02-23 06:43:57.812918389 +0000 UTC m=+190.152247646" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.813608 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lbzx5" podStartSLOduration=122.813604142 podStartE2EDuration="2m2.813604142s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:57.779682949 +0000 UTC m=+190.119012204" 
watchObservedRunningTime="2026-02-23 06:43:57.813604142 +0000 UTC m=+190.152933398" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.860528 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" Feb 23 06:43:57 crc kubenswrapper[4626]: I0223 06:43:57.982016 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:57 crc kubenswrapper[4626]: E0223 06:43:57.983040 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.056254 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:39:25.25321658 +0000 UTC Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.056332 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.064669 4626 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 23 06:43:58 crc kubenswrapper[4626]: E0223 06:43:58.083400 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.730449 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/1.log" Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.732212 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" event={"ID":"79882abb-d454-48ec-bef0-d60a2e0b81ca","Type":"ContainerStarted","Data":"c9e741691db8a6ecabeb861d03149011a6483ff04c98a3fc0f2321af157fa30a"} Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.732291 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" event={"ID":"79882abb-d454-48ec-bef0-d60a2e0b81ca","Type":"ContainerStarted","Data":"a4e338378c8810fd4b7dfcd81a7a092f291632b066964013ab99f0af15658169"} Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.981823 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.981907 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:43:58 crc kubenswrapper[4626]: I0223 06:43:58.981997 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:43:58 crc kubenswrapper[4626]: E0223 06:43:58.982121 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:43:58 crc kubenswrapper[4626]: E0223 06:43:58.982270 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:43:58 crc kubenswrapper[4626]: E0223 06:43:58.982369 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:43:59 crc kubenswrapper[4626]: I0223 06:43:59.981228 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:43:59 crc kubenswrapper[4626]: E0223 06:43:59.981462 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:44:00 crc kubenswrapper[4626]: I0223 06:44:00.981893 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:00 crc kubenswrapper[4626]: I0223 06:44:00.981955 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:00 crc kubenswrapper[4626]: E0223 06:44:00.982038 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:44:00 crc kubenswrapper[4626]: I0223 06:44:00.981970 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:00 crc kubenswrapper[4626]: E0223 06:44:00.982154 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:44:00 crc kubenswrapper[4626]: E0223 06:44:00.982438 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:44:01 crc kubenswrapper[4626]: I0223 06:44:01.981673 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:01 crc kubenswrapper[4626]: E0223 06:44:01.981853 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:44:02 crc kubenswrapper[4626]: I0223 06:44:02.981023 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:02 crc kubenswrapper[4626]: I0223 06:44:02.981047 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:02 crc kubenswrapper[4626]: I0223 06:44:02.981042 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:02 crc kubenswrapper[4626]: E0223 06:44:02.981181 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:44:02 crc kubenswrapper[4626]: E0223 06:44:02.981314 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:44:02 crc kubenswrapper[4626]: E0223 06:44:02.981450 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:44:03 crc kubenswrapper[4626]: E0223 06:44:03.085009 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:44:03 crc kubenswrapper[4626]: I0223 06:44:03.982108 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:03 crc kubenswrapper[4626]: E0223 06:44:03.982289 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:44:04 crc kubenswrapper[4626]: I0223 06:44:04.981611 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:04 crc kubenswrapper[4626]: I0223 06:44:04.981695 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:04 crc kubenswrapper[4626]: E0223 06:44:04.981747 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:44:04 crc kubenswrapper[4626]: I0223 06:44:04.981695 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:04 crc kubenswrapper[4626]: E0223 06:44:04.981906 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:44:04 crc kubenswrapper[4626]: E0223 06:44:04.981942 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:44:05 crc kubenswrapper[4626]: I0223 06:44:05.981258 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:05 crc kubenswrapper[4626]: E0223 06:44:05.981396 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:44:06 crc kubenswrapper[4626]: I0223 06:44:06.981086 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:06 crc kubenswrapper[4626]: I0223 06:44:06.981159 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:06 crc kubenswrapper[4626]: I0223 06:44:06.981202 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:06 crc kubenswrapper[4626]: E0223 06:44:06.981268 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:44:06 crc kubenswrapper[4626]: E0223 06:44:06.981410 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:44:06 crc kubenswrapper[4626]: E0223 06:44:06.981527 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:44:07 crc kubenswrapper[4626]: I0223 06:44:07.982027 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:07 crc kubenswrapper[4626]: E0223 06:44:07.982849 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08" Feb 23 06:44:08 crc kubenswrapper[4626]: E0223 06:44:08.085645 4626 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:44:08 crc kubenswrapper[4626]: I0223 06:44:08.981191 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:08 crc kubenswrapper[4626]: I0223 06:44:08.981233 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:08 crc kubenswrapper[4626]: I0223 06:44:08.981190 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:08 crc kubenswrapper[4626]: E0223 06:44:08.981566 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:44:08 crc kubenswrapper[4626]: I0223 06:44:08.981621 4626 scope.go:117] "RemoveContainer" containerID="fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649" Feb 23 06:44:08 crc kubenswrapper[4626]: E0223 06:44:08.981874 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:44:08 crc kubenswrapper[4626]: E0223 06:44:08.982182 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:44:08 crc kubenswrapper[4626]: I0223 06:44:08.982438 4626 scope.go:117] "RemoveContainer" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2" Feb 23 06:44:08 crc kubenswrapper[4626]: I0223 06:44:08.998606 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kbcpw" podStartSLOduration=133.998588015 podStartE2EDuration="2m13.998588015s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:43:58.748103987 +0000 UTC m=+191.087433254" watchObservedRunningTime="2026-02-23 06:44:08.998588015 +0000 UTC m=+201.337917271" Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.623303 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ls5wf"] Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.623448 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:09 crc kubenswrapper[4626]: E0223 06:44:09.623596 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08"
Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.764710 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/3.log"
Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.766465 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerStarted","Data":"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c"}
Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.767648 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf"
Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.771552 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/1.log"
Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.771601 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerStarted","Data":"9a25087115100c9626d3a1eafde3dd594af1266341b73b36a08abdb447c9395e"}
Feb 23 06:44:09 crc kubenswrapper[4626]: I0223 06:44:09.806054 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podStartSLOduration=134.806023748 podStartE2EDuration="2m14.806023748s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:09.794905231 +0000 UTC m=+202.134234497" watchObservedRunningTime="2026-02-23 06:44:09.806023748 +0000 UTC m=+202.145353014"
Feb 23 06:44:10 crc kubenswrapper[4626]: I0223 06:44:10.981375 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:44:10 crc kubenswrapper[4626]: I0223 06:44:10.981434 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf"
Feb 23 06:44:10 crc kubenswrapper[4626]: I0223 06:44:10.981396 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:44:10 crc kubenswrapper[4626]: I0223 06:44:10.981372 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:44:10 crc kubenswrapper[4626]: E0223 06:44:10.981546 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:44:10 crc kubenswrapper[4626]: E0223 06:44:10.981609 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:44:10 crc kubenswrapper[4626]: E0223 06:44:10.981802 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:44:10 crc kubenswrapper[4626]: E0223 06:44:10.981880 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08"
Feb 23 06:44:12 crc kubenswrapper[4626]: I0223 06:44:12.981613 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:44:12 crc kubenswrapper[4626]: I0223 06:44:12.981668 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:44:12 crc kubenswrapper[4626]: I0223 06:44:12.981709 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf"
Feb 23 06:44:12 crc kubenswrapper[4626]: E0223 06:44:12.981750 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:44:12 crc kubenswrapper[4626]: I0223 06:44:12.981673 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:44:12 crc kubenswrapper[4626]: E0223 06:44:12.981843 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:44:12 crc kubenswrapper[4626]: E0223 06:44:12.981964 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5wf" podUID="53b6af64-b3dc-44ae-96bd-90ab1b79dc08"
Feb 23 06:44:12 crc kubenswrapper[4626]: E0223 06:44:12.982007 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.981759 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.981787 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.981781 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.981881 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.984679 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.984679 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.984854 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.985945 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.985948 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 23 06:44:14 crc kubenswrapper[4626]: I0223 06:44:14.985971 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 23 06:44:17 crc kubenswrapper[4626]: I0223 06:44:17.975827 4626 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.010662 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mpcw2"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.011199 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.011591 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zc6wr"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.011856 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.012099 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.012662 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.013072 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xc7q9"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.013528 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.013747 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.014191 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.017606 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.017794 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.017847 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.017849 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.017932 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.017947 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.017991 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.018229 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.018323 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.018587 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.018818 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.018822 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.019126 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.019263 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.019592 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.021291 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.021629 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.021888 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.021962 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.025585 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.025982 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.026818 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.026858 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.027186 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.027194 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.028192 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.031592 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8j9bj"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.032921 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.033356 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.033856 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.039822 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.050298 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.050478 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.050570 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.051676 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.051782 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.051906 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.051958 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.052012 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.052061 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.052071 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.052082 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.052183 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.051794 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.052013 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.052615 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.054733 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.055071 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.055525 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5d8pq"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.055947 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.056724 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.057396 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.058176 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.058494 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.058831 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.059287 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.059717 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k27xh"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.060155 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.060212 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ktkzv"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.060529 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.062896 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.064242 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.064465 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.065527 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.065871 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.066348 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.066481 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.067155 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pchl5"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.067220 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.067449 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xgpg7"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.067466 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.067717 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dmjfl"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.067760 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.067952 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pchl5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.068199 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.068285 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.068334 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.068443 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.070396 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-4bfst"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.070730 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.074202 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.074669 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.074997 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.075157 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.075279 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.075472 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.075634 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.079969 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.085812 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.085954 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.086102 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.086229 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.086873 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.087352 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.087515 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.087730 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.087748 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.087841 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.087887 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.087960 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.088064 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.088082 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.088116 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.094391 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.095017 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.095366 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"]
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.095603 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.095977 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.097615 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.098147 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.098329 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.098615 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.098906 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100100 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-images\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100135 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-779rv\" (UniqueName: \"kubernetes.io/projected/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-kube-api-access-779rv\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100160 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-audit-policies\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100183 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-trusted-ca\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100200 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15b076e5-da87-4a61-8782-f5f08732ece9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100266 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15b076e5-da87-4a61-8782-f5f08732ece9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100291 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb155a06-d310-4f44-910d-96b02a5ed13b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100309 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-trusted-ca-bundle\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100391 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-metrics-certs\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100508 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-oauth-config\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100535 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4de0065f-e1a8-4dad-8c71-310890524a41-metrics-tls\") pod \"dns-operator-744455d44c-dmjfl\" (UID: \"4de0065f-e1a8-4dad-8c71-310890524a41\") " pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100558 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100580 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100598 4626 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100621 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4xq6\" (UniqueName: \"kubernetes.io/projected/f58bc597-63a9-4742-9985-236825024833-kube-api-access-j4xq6\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100663 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-config\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100690 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-client-ca\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100708 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9c2\" (UniqueName: \"kubernetes.io/projected/1957912e-d933-428f-98b3-65bb43ca2ad0-kube-api-access-vp9c2\") pod 
\"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100729 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-config\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100775 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39684a26-7fad-4a8b-9621-99db77c9a01f-service-ca-bundle\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100811 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100835 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6z44\" (UniqueName: \"kubernetes.io/projected/878646c5-5965-4cb2-909b-14737533f2a1-kube-api-access-x6z44\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100862 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100882 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100882 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vnmg\" (UniqueName: \"kubernetes.io/projected/7728d23e-9cc9-424f-8773-a55ba1e7b940-kube-api-access-9vnmg\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100920 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76jc\" (UniqueName: \"kubernetes.io/projected/39684a26-7fad-4a8b-9621-99db77c9a01f-kube-api-access-x76jc\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100942 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f58bc597-63a9-4742-9985-236825024833-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.100958 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-config\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.101044 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b076e5-da87-4a61-8782-f5f08732ece9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.101100 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-b4q5j"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.104092 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.104576 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.104776 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-stats-auth\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.104836 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-serving-cert\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.112400 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114149 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114375 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-ca\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114422 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114438 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114444 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9f4h\" (UniqueName: \"kubernetes.io/projected/94e222b5-d52d-46c9-ac45-1a0158bdd383-kube-api-access-w9f4h\") pod 
\"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114465 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-audit-dir\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114509 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r85sh\" (UniqueName: \"kubernetes.io/projected/82862b64-d9a7-4970-b6d7-72afdf910e70-kube-api-access-r85sh\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114527 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-encryption-config\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114542 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-service-ca\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114557 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114635 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-etcd-client\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114668 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnflg\" (UniqueName: \"kubernetes.io/projected/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-kube-api-access-dnflg\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114761 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-serving-cert\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114802 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114808 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-service-ca\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114833 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb155a06-d310-4f44-910d-96b02a5ed13b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114849 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-oauth-serving-cert\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114866 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgp6\" (UniqueName: \"kubernetes.io/projected/4abfb5ed-4161-41d1-9cb5-70a93c60e109-kube-api-access-4fgp6\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114882 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/439f9126-bf49-4c43-aef4-c993cd5d818f-audit-dir\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114919 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/da93645e-0946-47f3-8edd-933148cbc8d2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c8plh\" (UID: \"da93645e-0946-47f3-8edd-933148cbc8d2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114938 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt6rd\" (UniqueName: \"kubernetes.io/projected/f9674062-fec2-47d4-af0e-a7c268402ee2-kube-api-access-dt6rd\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114959 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wr9d\" (UniqueName: \"kubernetes.io/projected/fb155a06-d310-4f44-910d-96b02a5ed13b-kube-api-access-2wr9d\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114975 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f58bc597-63a9-4742-9985-236825024833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114993 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7728d23e-9cc9-424f-8773-a55ba1e7b940-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115018 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/82862b64-d9a7-4970-b6d7-72afdf910e70-machine-approver-tls\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115049 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115068 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-serving-cert\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.114921 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115087 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115104 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f8d72e-3f0a-41a6-813d-2538046dff59-config\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115118 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115133 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9674062-fec2-47d4-af0e-a7c268402ee2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115149 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-encryption-config\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115163 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f8d72e-3f0a-41a6-813d-2538046dff59-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115072 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115185 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-config\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115201 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-policies\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115214 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-default-certificate\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115231 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxspq\" (UniqueName: \"kubernetes.io/projected/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-kube-api-access-vxspq\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115246 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/439f9126-bf49-4c43-aef4-c993cd5d818f-node-pullsecrets\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115260 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-audit\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115276 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115292 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115307 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/878646c5-5965-4cb2-909b-14737533f2a1-serving-cert\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115327 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prphz\" (UniqueName: \"kubernetes.io/projected/4de0065f-e1a8-4dad-8c71-310890524a41-kube-api-access-prphz\") pod \"dns-operator-744455d44c-dmjfl\" (UID: \"4de0065f-e1a8-4dad-8c71-310890524a41\") " pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115348 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncm4c\" (UniqueName: \"kubernetes.io/projected/439f9126-bf49-4c43-aef4-c993cd5d818f-kube-api-access-ncm4c\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115366 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-serving-cert\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115382 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7728d23e-9cc9-424f-8773-a55ba1e7b940-serving-cert\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115398 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-884bg\" (UniqueName: \"kubernetes.io/projected/919789ac-a13f-430c-a00c-5ab73f8e8cba-kube-api-access-884bg\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115418 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115514 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndfk\" (UniqueName: \"kubernetes.io/projected/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-kube-api-access-jndfk\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115532 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-image-import-ca\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115549 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-config\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115564 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115646 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1957912e-d933-428f-98b3-65bb43ca2ad0-serving-cert\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115651 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115687 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115706 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82862b64-d9a7-4970-b6d7-72afdf910e70-auth-proxy-config\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115731 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-config\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115776 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-etcd-serving-ca\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.115979 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f58bc597-63a9-4742-9985-236825024833-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116003 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-config\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116156 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9674062-fec2-47d4-af0e-a7c268402ee2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116188 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116264 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e222b5-d52d-46c9-ac45-1a0158bdd383-serving-cert\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116298 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-client\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc 
kubenswrapper[4626]: I0223 06:44:18.116346 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116368 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-dir\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116387 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116448 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-serving-cert\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116486 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-etcd-client\") 
pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116554 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116565 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116567 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116684 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82862b64-d9a7-4970-b6d7-72afdf910e70-config\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116700 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96f8d72e-3f0a-41a6-813d-2538046dff59-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116721 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7st\" (UniqueName: \"kubernetes.io/projected/da93645e-0946-47f3-8edd-933148cbc8d2-kube-api-access-hn7st\") pod \"cluster-samples-operator-665b6dd947-c8plh\" (UID: \"da93645e-0946-47f3-8edd-933148cbc8d2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116734 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-client-ca\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116749 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116750 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-config\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.116968 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.118138 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.118258 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 06:44:18 crc 
kubenswrapper[4626]: I0223 06:44:18.118350 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.118447 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.118757 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.119192 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.119251 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.119381 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.119392 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.121113 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.123634 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.124032 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 23 06:44:18 crc 
kubenswrapper[4626]: I0223 06:44:18.124326 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.124464 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.125152 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.125733 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.125954 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b4q5j" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.126139 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.131997 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.132958 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.139964 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cqdrn"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.140269 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.140462 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.148861 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.155735 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.161644 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.162044 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vs6nc"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.162336 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.162628 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.162917 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163288 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ggnc8"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163656 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163712 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163808 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163857 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163882 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163901 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.163948 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.165128 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.165571 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.166062 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mpcw2"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.168189 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.168820 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.168849 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zc6wr"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.170091 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hw7rq"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.170570 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hw7rq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.171053 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.174635 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xc7q9"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.174659 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8j9bj"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.174669 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.176191 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.177890 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.185596 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.186934 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.189248 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.191446 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.195668 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.197174 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.198908 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pchl5"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.198945 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.200570 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k27xh"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.202967 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5d8pq"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.204113 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dmjfl"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.207058 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vs6nc"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.208694 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.208866 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ktkzv"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.209196 4626 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.211145 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cqdrn"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.211836 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.213492 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.217376 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.217944 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-audit-dir\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.217983 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r85sh\" (UniqueName: \"kubernetes.io/projected/82862b64-d9a7-4970-b6d7-72afdf910e70-kube-api-access-r85sh\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218007 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-etcd-client\") 
pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218032 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-encryption-config\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218049 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-service-ca\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218067 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218092 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-serving-cert\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218108 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-service-ca\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218123 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnflg\" (UniqueName: \"kubernetes.io/projected/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-kube-api-access-dnflg\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218141 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgp6\" (UniqueName: \"kubernetes.io/projected/4abfb5ed-4161-41d1-9cb5-70a93c60e109-kube-api-access-4fgp6\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218155 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/439f9126-bf49-4c43-aef4-c993cd5d818f-audit-dir\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218169 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb155a06-d310-4f44-910d-96b02a5ed13b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218185 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-oauth-serving-cert\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218203 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt6rd\" (UniqueName: \"kubernetes.io/projected/f9674062-fec2-47d4-af0e-a7c268402ee2-kube-api-access-dt6rd\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218227 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/da93645e-0946-47f3-8edd-933148cbc8d2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c8plh\" (UID: \"da93645e-0946-47f3-8edd-933148cbc8d2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218244 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7728d23e-9cc9-424f-8773-a55ba1e7b940-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218259 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/82862b64-d9a7-4970-b6d7-72afdf910e70-machine-approver-tls\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218280 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wr9d\" (UniqueName: \"kubernetes.io/projected/fb155a06-d310-4f44-910d-96b02a5ed13b-kube-api-access-2wr9d\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218298 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f58bc597-63a9-4742-9985-236825024833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218315 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218332 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-serving-cert\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218347 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218361 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9674062-fec2-47d4-af0e-a7c268402ee2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218378 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-encryption-config\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218393 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f8d72e-3f0a-41a6-813d-2538046dff59-config\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218412 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f8d72e-3f0a-41a6-813d-2538046dff59-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218427 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-default-certificate\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218442 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-config\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218457 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-policies\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218473 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxspq\" (UniqueName: \"kubernetes.io/projected/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-kube-api-access-vxspq\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218487 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/439f9126-bf49-4c43-aef4-c993cd5d818f-node-pullsecrets\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218521 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-audit\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218540 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218558 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218585 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-serving-cert\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218602 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7728d23e-9cc9-424f-8773-a55ba1e7b940-serving-cert\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218620 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/878646c5-5965-4cb2-909b-14737533f2a1-serving-cert\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218640 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prphz\" (UniqueName: \"kubernetes.io/projected/4de0065f-e1a8-4dad-8c71-310890524a41-kube-api-access-prphz\") pod \"dns-operator-744455d44c-dmjfl\" (UID: \"4de0065f-e1a8-4dad-8c71-310890524a41\") " pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218655 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncm4c\" (UniqueName: \"kubernetes.io/projected/439f9126-bf49-4c43-aef4-c993cd5d818f-kube-api-access-ncm4c\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218671 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-884bg\" (UniqueName: \"kubernetes.io/projected/919789ac-a13f-430c-a00c-5ab73f8e8cba-kube-api-access-884bg\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 
06:44:18.218687 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-config\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218703 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218721 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218746 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jndfk\" (UniqueName: \"kubernetes.io/projected/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-kube-api-access-jndfk\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218764 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-image-import-ca\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218779 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82862b64-d9a7-4970-b6d7-72afdf910e70-auth-proxy-config\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218795 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-config\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218811 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1957912e-d933-428f-98b3-65bb43ca2ad0-serving-cert\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218828 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218857 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-config\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: 
\"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218874 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9674062-fec2-47d4-af0e-a7c268402ee2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218891 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-etcd-serving-ca\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218917 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f58bc597-63a9-4742-9985-236825024833-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218935 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218953 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218969 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e222b5-d52d-46c9-ac45-1a0158bdd383-serving-cert\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.218987 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-client\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219002 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219019 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-dir\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219045 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-serving-cert\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219060 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-etcd-client\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219079 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82862b64-d9a7-4970-b6d7-72afdf910e70-config\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219094 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96f8d72e-3f0a-41a6-813d-2538046dff59-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219110 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219128 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7st\" (UniqueName: \"kubernetes.io/projected/da93645e-0946-47f3-8edd-933148cbc8d2-kube-api-access-hn7st\") pod \"cluster-samples-operator-665b6dd947-c8plh\" (UID: \"da93645e-0946-47f3-8edd-933148cbc8d2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219147 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-client-ca\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219165 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-config\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219180 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-images\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219197 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-779rv\" (UniqueName: 
\"kubernetes.io/projected/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-kube-api-access-779rv\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219214 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15b076e5-da87-4a61-8782-f5f08732ece9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219230 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15b076e5-da87-4a61-8782-f5f08732ece9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219245 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-audit-policies\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219262 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-trusted-ca\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:18 crc 
kubenswrapper[4626]: I0223 06:44:18.219281 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb155a06-d310-4f44-910d-96b02a5ed13b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219297 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-trusted-ca-bundle\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219314 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-metrics-certs\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219331 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219348 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-oauth-config\") pod \"console-f9d7485db-ktkzv\" (UID: 
\"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219363 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4de0065f-e1a8-4dad-8c71-310890524a41-metrics-tls\") pod \"dns-operator-744455d44c-dmjfl\" (UID: \"4de0065f-e1a8-4dad-8c71-310890524a41\") " pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219380 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219395 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219417 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4xq6\" (UniqueName: \"kubernetes.io/projected/f58bc597-63a9-4742-9985-236825024833-kube-api-access-j4xq6\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219433 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-config\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219452 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-client-ca\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219471 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9c2\" (UniqueName: \"kubernetes.io/projected/1957912e-d933-428f-98b3-65bb43ca2ad0-kube-api-access-vp9c2\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219487 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-config\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219518 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39684a26-7fad-4a8b-9621-99db77c9a01f-service-ca-bundle\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219543 4626 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219561 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6z44\" (UniqueName: \"kubernetes.io/projected/878646c5-5965-4cb2-909b-14737533f2a1-kube-api-access-x6z44\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219586 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219603 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vnmg\" (UniqueName: \"kubernetes.io/projected/7728d23e-9cc9-424f-8773-a55ba1e7b940-kube-api-access-9vnmg\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219619 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76jc\" (UniqueName: \"kubernetes.io/projected/39684a26-7fad-4a8b-9621-99db77c9a01f-kube-api-access-x76jc\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " 
pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219637 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b076e5-da87-4a61-8782-f5f08732ece9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219653 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-stats-auth\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219673 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f58bc597-63a9-4742-9985-236825024833-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219691 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-config\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219707 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219728 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-serving-cert\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219743 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-ca\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.219760 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9f4h\" (UniqueName: \"kubernetes.io/projected/94e222b5-d52d-46c9-ac45-1a0158bdd383-kube-api-access-w9f4h\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.220433 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-image-import-ca\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.220511 4626 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-audit-dir\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.222279 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-config\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.225631 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82862b64-d9a7-4970-b6d7-72afdf910e70-auth-proxy-config\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.226376 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-encryption-config\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.226899 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-service-ca\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.227485 4626 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-config\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.227480 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.230556 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.230569 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.230581 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.231351 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xgpg7"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.232220 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rtqxx"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.232304 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-policies\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.232436 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/439f9126-bf49-4c43-aef4-c993cd5d818f-node-pullsecrets\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.232863 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-audit\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.232933 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rtqxx" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.233287 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.233634 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7728d23e-9cc9-424f-8773-a55ba1e7b940-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.233795 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: 
\"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.234462 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-service-ca\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.235194 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb155a06-d310-4f44-910d-96b02a5ed13b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.235255 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/439f9126-bf49-4c43-aef4-c993cd5d818f-audit-dir\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.235385 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-config\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.236632 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: 
\"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.236736 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pj6n8"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.237475 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b4q5j"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.237575 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pj6n8" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.237802 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.238250 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9674062-fec2-47d4-af0e-a7c268402ee2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.238812 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-oauth-serving-cert\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.242197 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ggnc8"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.242222 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"] Feb 23 
06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.242233 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rtqxx"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.246572 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.246597 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.246609 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.248507 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pj6n8"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.248525 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fjnzs"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.249436 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.249726 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1957912e-d933-428f-98b3-65bb43ca2ad0-serving-cert\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.250602 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fjnzs"] Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.250621 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-config\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.250847 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.251051 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.251592 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: 
\"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.252908 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-client-ca\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.253741 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/da93645e-0946-47f3-8edd-933148cbc8d2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c8plh\" (UID: \"da93645e-0946-47f3-8edd-933148cbc8d2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.254010 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.254304 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.254329 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/878646c5-5965-4cb2-909b-14737533f2a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.254386 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-dir\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.254450 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.254890 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-trusted-ca-bundle\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.255394 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.255406 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-serving-cert\") pod \"etcd-operator-b45778765-5d8pq\" (UID: 
\"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.255664 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.258192 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82862b64-d9a7-4970-b6d7-72afdf910e70-config\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.256389 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-serving-cert\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.256490 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-serving-cert\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.256555 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/82862b64-d9a7-4970-b6d7-72afdf910e70-machine-approver-tls\") pod 
\"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.256695 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7728d23e-9cc9-424f-8773-a55ba1e7b940-serving-cert\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.256922 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.257006 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-etcd-serving-ca\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.257018 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-etcd-client\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.257379 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f9674062-fec2-47d4-af0e-a7c268402ee2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.258351 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb155a06-d310-4f44-910d-96b02a5ed13b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.258468 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/878646c5-5965-4cb2-909b-14737533f2a1-serving-cert\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.258872 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-config\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.256236 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-images\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.259399 4626 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-client-ca\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.259439 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/439f9126-bf49-4c43-aef4-c993cd5d818f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.259947 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.259950 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b076e5-da87-4a61-8782-f5f08732ece9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.260056 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-config\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 
06:44:18.260391 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-audit-policies\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.260635 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.260868 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-ca\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.261665 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.262095 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-config\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" Feb 
23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.262212 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-etcd-client\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.262626 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-etcd-client\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.262717 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f58bc597-63a9-4742-9985-236825024833-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.262820 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.263206 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.263233 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.263316 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-oauth-config\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.263445 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e222b5-d52d-46c9-ac45-1a0158bdd383-serving-cert\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.264278 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-encryption-config\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.264583 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f58bc597-63a9-4742-9985-236825024833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.264702 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15b076e5-da87-4a61-8782-f5f08732ece9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.267058 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f9126-bf49-4c43-aef4-c993cd5d818f-serving-cert\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.272623 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.297726 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.312449 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.332649 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.336646 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-serving-cert\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.353145 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.360036 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4de0065f-e1a8-4dad-8c71-310890524a41-metrics-tls\") pod \"dns-operator-744455d44c-dmjfl\" (UID: \"4de0065f-e1a8-4dad-8c71-310890524a41\") " pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.373186 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.392999 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.412913 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.432538 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.458190 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.462562 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-trusted-ca\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.474072 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.492705 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.502839 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-config\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.512963 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.532936 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.553079 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.560528 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-default-certificate\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.572673 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.579895 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-stats-auth\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.592427 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.597660 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39684a26-7fad-4a8b-9621-99db77c9a01f-metrics-certs\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.613568 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.632795 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.635815 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39684a26-7fad-4a8b-9621-99db77c9a01f-service-ca-bundle\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.653562 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.672815 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.682295 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96f8d72e-3f0a-41a6-813d-2538046dff59-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.692486 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.733349 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.753049 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.772757 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.792346 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.818400 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.838629 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.847282 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96f8d72e-3f0a-41a6-813d-2538046dff59-config\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.853516 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.873283 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.893237 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.913327 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.933443 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.954018 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 23 06:44:18 crc kubenswrapper[4626]: I0223 06:44:18.992880 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.013355 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.033272 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.052641 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.073203 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.092656 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.112694 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.132086 4626 request.go:700] Waited for 1.005633468s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.132976 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.152993 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.172953 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.193838 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.212772 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.233518 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.253205 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.272974 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.293190 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.312649 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.332615 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.353322 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.373016 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.393284 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.412828 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.433100 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.453128 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.478005 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.493438 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.513839 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.533525 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.553175 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.573150 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.593126 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.612799 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.632494 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.653172 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.672889 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.693635 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.713068 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.734552 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.753672 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.773556 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.793236 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.813303 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.833443 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.853265 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.888142 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9f4h\" (UniqueName: \"kubernetes.io/projected/94e222b5-d52d-46c9-ac45-1a0158bdd383-kube-api-access-w9f4h\") pod \"route-controller-manager-6576b87f9c-97rfn\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.904975 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r85sh\" (UniqueName: \"kubernetes.io/projected/82862b64-d9a7-4970-b6d7-72afdf910e70-kube-api-access-r85sh\") pod \"machine-approver-56656f9798-ljz6r\" (UID: \"82862b64-d9a7-4970-b6d7-72afdf910e70\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.924802 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96f8d72e-3f0a-41a6-813d-2538046dff59-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r5txd\" (UID: \"96f8d72e-3f0a-41a6-813d-2538046dff59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.926592 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.934848 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:19 crc kubenswrapper[4626]: W0223 06:44:19.938687 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82862b64_d9a7_4970_b6d7_72afdf910e70.slice/crio-40d32ce55b241096d4f566bf2b69f8ed912136fd06afc58f47df41320220bf7d WatchSource:0}: Error finding container 40d32ce55b241096d4f566bf2b69f8ed912136fd06afc58f47df41320220bf7d: Status 404 returned error can't find the container with id 40d32ce55b241096d4f566bf2b69f8ed912136fd06afc58f47df41320220bf7d
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.945269 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxspq\" (UniqueName: \"kubernetes.io/projected/a8528c5d-e8a3-43e6-b17b-2adee2bcc66e-kube-api-access-vxspq\") pod \"etcd-operator-b45778765-5d8pq\" (UID: \"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.953383 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.972117 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.972948 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 23 06:44:19 crc kubenswrapper[4626]: I0223 06:44:19.993189 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.013269 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.058709 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgp6\" (UniqueName: \"kubernetes.io/projected/4abfb5ed-4161-41d1-9cb5-70a93c60e109-kube-api-access-4fgp6\") pod \"console-f9d7485db-ktkzv\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.075464 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnflg\" (UniqueName: \"kubernetes.io/projected/03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8-kube-api-access-dnflg\") pod \"apiserver-7bbb656c7d-rqbt5\" (UID: \"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.099292 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wr9d\" (UniqueName: \"kubernetes.io/projected/fb155a06-d310-4f44-910d-96b02a5ed13b-kube-api-access-2wr9d\") pod \"openshift-controller-manager-operator-756b6f6bc6-vd8n9\" (UID: \"fb155a06-d310-4f44-910d-96b02a5ed13b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.102281 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.107034 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prphz\" (UniqueName: \"kubernetes.io/projected/4de0065f-e1a8-4dad-8c71-310890524a41-kube-api-access-prphz\") pod \"dns-operator-744455d44c-dmjfl\" (UID: \"4de0065f-e1a8-4dad-8c71-310890524a41\") " pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.128642 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncm4c\" (UniqueName: \"kubernetes.io/projected/439f9126-bf49-4c43-aef4-c993cd5d818f-kube-api-access-ncm4c\") pod \"apiserver-76f77b778f-mpcw2\" (UID: \"439f9126-bf49-4c43-aef4-c993cd5d818f\") " pod="openshift-apiserver/apiserver-76f77b778f-mpcw2"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.132520 4626 request.go:700] Waited for 1.8975063s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.148580 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-884bg\" (UniqueName: \"kubernetes.io/projected/919789ac-a13f-430c-a00c-5ab73f8e8cba-kube-api-access-884bg\") pod \"oauth-openshift-558db77b4-8j9bj\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.157234 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.166052 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"]
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.186125 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.187204 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5d8pq"]
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.193677 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.233387 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt6rd\" (UniqueName: \"kubernetes.io/projected/f9674062-fec2-47d4-af0e-a7c268402ee2-kube-api-access-dt6rd\") pod \"openshift-apiserver-operator-796bbdcf4f-dz677\" (UID: \"f9674062-fec2-47d4-af0e-a7c268402ee2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.241729 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.250166 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.258777 4626 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.259039 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.287128 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.289712 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.290677 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.315301 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndfk\" (UniqueName: \"kubernetes.io/projected/0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a-kube-api-access-jndfk\") pod \"console-operator-58897d9998-pchl5\" (UID: \"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a\") " pod="openshift-console-operator/console-operator-58897d9998-pchl5"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.320074 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.330619 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7st\" (UniqueName: \"kubernetes.io/projected/da93645e-0946-47f3-8edd-933148cbc8d2-kube-api-access-hn7st\") pod \"cluster-samples-operator-665b6dd947-c8plh\" (UID: \"da93645e-0946-47f3-8edd-933148cbc8d2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.335074 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd"]
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.348222 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-779rv\" (UniqueName: \"kubernetes.io/projected/77d47a11-2a3a-4803-8f4b-3bfe07c27e00-kube-api-access-779rv\") pod \"machine-api-operator-5694c8668f-qnh2c\" (UID: \"77d47a11-2a3a-4803-8f4b-3bfe07c27e00\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"
Feb 23 06:44:20 crc kubenswrapper[4626]: W0223 06:44:20.349903 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96f8d72e_3f0a_41a6_813d_2538046dff59.slice/crio-dc57c2beee85dacfd4e3ddee8a4f319f3a7eef54d1042bd154a512f88dc04359 WatchSource:0}: Error finding container dc57c2beee85dacfd4e3ddee8a4f319f3a7eef54d1042bd154a512f88dc04359: Status 404 returned error can't find the container with id dc57c2beee85dacfd4e3ddee8a4f319f3a7eef54d1042bd154a512f88dc04359
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.366747 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f58bc597-63a9-4742-9985-236825024833-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.388995 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.389048 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pchl5"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.390429 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6z44\" (UniqueName: \"kubernetes.io/projected/878646c5-5965-4cb2-909b-14737533f2a1-kube-api-access-x6z44\") pod \"authentication-operator-69f744f599-xc7q9\" (UID: \"878646c5-5965-4cb2-909b-14737533f2a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.412460 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4xq6\" (UniqueName: \"kubernetes.io/projected/f58bc597-63a9-4742-9985-236825024833-kube-api-access-j4xq6\") pod \"cluster-image-registry-operator-dc59b4c8b-848ld\" (UID: \"f58bc597-63a9-4742-9985-236825024833\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.423370 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.430282 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vnmg\" (UniqueName: \"kubernetes.io/projected/7728d23e-9cc9-424f-8773-a55ba1e7b940-kube-api-access-9vnmg\") pod \"openshift-config-operator-7777fb866f-k27xh\" (UID: \"7728d23e-9cc9-424f-8773-a55ba1e7b940\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.450284 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9c2\" (UniqueName: \"kubernetes.io/projected/1957912e-d933-428f-98b3-65bb43ca2ad0-kube-api-access-vp9c2\") pod \"controller-manager-879f6c89f-zc6wr\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.456339 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.468877 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.472703 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15b076e5-da87-4a61-8782-f5f08732ece9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mr26t\" (UID: \"15b076e5-da87-4a61-8782-f5f08732ece9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.476448 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.488527 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76jc\" (UniqueName: \"kubernetes.io/projected/39684a26-7fad-4a8b-9621-99db77c9a01f-kube-api-access-x76jc\") pod \"router-default-5444994796-4bfst\" (UID: \"39684a26-7fad-4a8b-9621-99db77c9a01f\") " pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.504694 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8j9bj"] Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.535848 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9"] Feb 23 06:44:20 crc kubenswrapper[4626]: W0223 06:44:20.547827 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919789ac_a13f_430c_a00c_5ab73f8e8cba.slice/crio-d0761d5522b0826edfff4027d21666f1181d0d9a00055424be2bde7e73397c74 WatchSource:0}: Error finding container d0761d5522b0826edfff4027d21666f1181d0d9a00055424be2bde7e73397c74: Status 404 returned error can't find the container with id d0761d5522b0826edfff4027d21666f1181d0d9a00055424be2bde7e73397c74 Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557747 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc9xj\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-kube-api-access-cc9xj\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557792 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557836 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e5e4e49-7bad-4e99-a662-b0f4ca041477-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557865 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-bound-sa-token\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557888 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-trusted-ca\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557917 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-certificates\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: 
\"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557934 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e5e4e49-7bad-4e99-a662-b0f4ca041477-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.557976 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-tls\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.558269 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.058258729 +0000 UTC m=+213.397587995 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.584711 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.606357 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659083 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659307 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e5e4e49-7bad-4e99-a662-b0f4ca041477-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659355 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c535fdcc-06de-4e87-a104-962f422a8df0-metrics-tls\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659391 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-node-bootstrap-token\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " 
pod="openshift-machine-config-operator/machine-config-server-hw7rq" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659407 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45889c6c-eea4-447e-85a6-8045ba5a3fae-apiservice-cert\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659436 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-serving-cert\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659483 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hg9k\" (UniqueName: \"kubernetes.io/projected/45889c6c-eea4-447e-85a6-8045ba5a3fae-kube-api-access-9hg9k\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659527 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjmvt\" (UniqueName: \"kubernetes.io/projected/7de9fe51-8926-4966-85f1-b14c16db8a74-kube-api-access-wjmvt\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659561 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-trusted-ca\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659622 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7de9fe51-8926-4966-85f1-b14c16db8a74-secret-volume\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659642 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db82ca2b-5eac-4858-8808-7b6e22af0e26-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q9kk6\" (UID: \"db82ca2b-5eac-4858-8808-7b6e22af0e26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659670 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcs5k\" (UniqueName: \"kubernetes.io/projected/b4bf205f-74bd-4aa7-879c-f034a0fd8465-kube-api-access-vcs5k\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659686 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: 
\"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659703 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca0fe033-a085-405b-a425-3ddc1f5e7e39-metrics-tls\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659735 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ec90488-5451-4df4-b9d6-926ced491b80-proxy-tls\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659750 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e543e751-687f-4a05-b3b0-22733a274f7f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659787 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh8tp\" (UniqueName: \"kubernetes.io/projected/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-kube-api-access-xh8tp\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659830 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm4x4\" (UniqueName: \"kubernetes.io/projected/fbcd0737-551b-4d4b-bd69-ab6e324fe199-kube-api-access-zm4x4\") pod \"package-server-manager-789f6589d5-6rz8s\" (UID: \"fbcd0737-551b-4d4b-bd69-ab6e324fe199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659845 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhj8\" (UniqueName: \"kubernetes.io/projected/cf1ed808-7fba-44e8-9722-4b87c503ee9c-kube-api-access-2dhj8\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659858 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81b9ec25-c107-4b28-9b74-bc36564db58c-cert\") pod \"ingress-canary-rtqxx\" (UID: \"81b9ec25-c107-4b28-9b74-bc36564db58c\") " pod="openshift-ingress-canary/ingress-canary-rtqxx" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659899 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm2v8\" (UniqueName: \"kubernetes.io/projected/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-kube-api-access-tm2v8\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " pod="openshift-machine-config-operator/machine-config-server-hw7rq" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659913 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfk4q\" (UniqueName: \"kubernetes.io/projected/32d57d80-6063-45cc-9e60-3f72f601cd83-kube-api-access-xfk4q\") pod \"migrator-59844c95c7-7jqjk\" (UID: 
\"32d57d80-6063-45cc-9e60-3f72f601cd83\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.659955 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt4pr\" (UniqueName: \"kubernetes.io/projected/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-kube-api-access-xt4pr\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663049 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45889c6c-eea4-447e-85a6-8045ba5a3fae-webhook-cert\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663091 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6s7x\" (UniqueName: \"kubernetes.io/projected/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-kube-api-access-l6s7x\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663132 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-plugins-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 
06:44:20.663172 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkjld\" (UniqueName: \"kubernetes.io/projected/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-kube-api-access-lkjld\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663218 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc9xj\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-kube-api-access-cc9xj\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663259 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-srv-cert\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663290 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-mountpoint-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663310 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278hp\" (UniqueName: \"kubernetes.io/projected/81b9ec25-c107-4b28-9b74-bc36564db58c-kube-api-access-278hp\") pod \"ingress-canary-rtqxx\" (UID: 
\"81b9ec25-c107-4b28-9b74-bc36564db58c\") " pod="openshift-ingress-canary/ingress-canary-rtqxx" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.663326 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7de9fe51-8926-4966-85f1-b14c16db8a74-config-volume\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.663386 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.16336499 +0000 UTC m=+213.502694256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.664891 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.664932 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-socket-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665005 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b33493db-9d18-4fd8-936a-896a8cbc16c3-srv-cert\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665032 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665049 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-config\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665089 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca0fe033-a085-405b-a425-3ddc1f5e7e39-config-volume\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665128 
4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-registration-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665143 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf1ed808-7fba-44e8-9722-4b87c503ee9c-signing-key\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665215 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-bound-sa-token\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665230 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-csi-data-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665248 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665265 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g5mj\" (UniqueName: \"kubernetes.io/projected/ca0fe033-a085-405b-a425-3ddc1f5e7e39-kube-api-access-9g5mj\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665292 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf1ed808-7fba-44e8-9722-4b87c503ee9c-signing-cabundle\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665307 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c535fdcc-06de-4e87-a104-962f422a8df0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665332 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b33493db-9d18-4fd8-936a-896a8cbc16c3-profile-collector-cert\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665351 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665404 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8lm8\" (UniqueName: \"kubernetes.io/projected/c535fdcc-06de-4e87-a104-962f422a8df0-kube-api-access-q8lm8\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665442 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665475 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c535fdcc-06de-4e87-a104-962f422a8df0-trusted-ca\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665642 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543e751-687f-4a05-b3b0-22733a274f7f-config\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665766 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-certificates\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665805 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665820 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543e751-687f-4a05-b3b0-22733a274f7f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665840 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e5e4e49-7bad-4e99-a662-b0f4ca041477-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665935 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-certs\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " pod="openshift-machine-config-operator/machine-config-server-hw7rq"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665955 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd5hh\" (UniqueName: \"kubernetes.io/projected/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-kube-api-access-jd5hh\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.665970 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/45889c6c-eea4-447e-85a6-8045ba5a3fae-tmpfs\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.666002 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7181557-2eef-41be-9c83-ae2f0a1cfbfe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cqdrn\" (UID: \"f7181557-2eef-41be-9c83-ae2f0a1cfbfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn"
Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.666308 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.166289587 +0000 UTC m=+213.505618853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.666739 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e5e4e49-7bad-4e99-a662-b0f4ca041477-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667462 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hx8f\" (UniqueName: \"kubernetes.io/projected/f7181557-2eef-41be-9c83-ae2f0a1cfbfe-kube-api-access-6hx8f\") pod \"multus-admission-controller-857f4d67dd-cqdrn\" (UID: \"f7181557-2eef-41be-9c83-ae2f0a1cfbfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667535 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbcd0737-551b-4d4b-bd69-ab6e324fe199-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6rz8s\" (UID: \"fbcd0737-551b-4d4b-bd69-ab6e324fe199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667588 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-tls\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667607 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ec90488-5451-4df4-b9d6-926ced491b80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667624 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77zhp\" (UniqueName: \"kubernetes.io/projected/7ec90488-5451-4df4-b9d6-926ced491b80-kube-api-access-77zhp\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667768 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gb8p\" (UniqueName: \"kubernetes.io/projected/0535917b-9b6d-486b-b932-964a18be9e51-kube-api-access-9gb8p\") pod \"downloads-7954f5f757-b4q5j\" (UID: \"0535917b-9b6d-486b-b932-964a18be9e51\") " pod="openshift-console/downloads-7954f5f757-b4q5j"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667820 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h28h\" (UniqueName: \"kubernetes.io/projected/db82ca2b-5eac-4858-8808-7b6e22af0e26-kube-api-access-6h28h\") pod \"control-plane-machine-set-operator-78cbb6b69f-q9kk6\" (UID: \"db82ca2b-5eac-4858-8808-7b6e22af0e26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667838 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ec90488-5451-4df4-b9d6-926ced491b80-images\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667865 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6sz\" (UniqueName: \"kubernetes.io/projected/b33493db-9d18-4fd8-936a-896a8cbc16c3-kube-api-access-bc6sz\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.667880 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-proxy-tls\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.671987 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.674940 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-certificates\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.682867 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-trusted-ca\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.693769 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e5e4e49-7bad-4e99-a662-b0f4ca041477-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.698597 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.703973 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-tls\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.704919 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dmjfl"]
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.735618 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-bound-sa-token\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.739803 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.744992 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc9xj\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-kube-api-access-cc9xj\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.770927 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.773957 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.273932866 +0000 UTC m=+213.613262131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.780524 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pchl5"]
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787578 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ec90488-5451-4df4-b9d6-926ced491b80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787609 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77zhp\" (UniqueName: \"kubernetes.io/projected/7ec90488-5451-4df4-b9d6-926ced491b80-kube-api-access-77zhp\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787633 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gb8p\" (UniqueName: \"kubernetes.io/projected/0535917b-9b6d-486b-b932-964a18be9e51-kube-api-access-9gb8p\") pod \"downloads-7954f5f757-b4q5j\" (UID: \"0535917b-9b6d-486b-b932-964a18be9e51\") " pod="openshift-console/downloads-7954f5f757-b4q5j"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787656 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h28h\" (UniqueName: \"kubernetes.io/projected/db82ca2b-5eac-4858-8808-7b6e22af0e26-kube-api-access-6h28h\") pod \"control-plane-machine-set-operator-78cbb6b69f-q9kk6\" (UID: \"db82ca2b-5eac-4858-8808-7b6e22af0e26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787673 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ec90488-5451-4df4-b9d6-926ced491b80-images\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787690 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6sz\" (UniqueName: \"kubernetes.io/projected/b33493db-9d18-4fd8-936a-896a8cbc16c3-kube-api-access-bc6sz\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787709 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-proxy-tls\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787745 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c535fdcc-06de-4e87-a104-962f422a8df0-metrics-tls\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787764 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45889c6c-eea4-447e-85a6-8045ba5a3fae-apiservice-cert\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787782 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-node-bootstrap-token\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " pod="openshift-machine-config-operator/machine-config-server-hw7rq"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787798 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-serving-cert\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787814 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hg9k\" (UniqueName: \"kubernetes.io/projected/45889c6c-eea4-447e-85a6-8045ba5a3fae-kube-api-access-9hg9k\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787836 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjmvt\" (UniqueName: \"kubernetes.io/projected/7de9fe51-8926-4966-85f1-b14c16db8a74-kube-api-access-wjmvt\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787860 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7de9fe51-8926-4966-85f1-b14c16db8a74-secret-volume\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787879 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db82ca2b-5eac-4858-8808-7b6e22af0e26-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q9kk6\" (UID: \"db82ca2b-5eac-4858-8808-7b6e22af0e26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787907 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca0fe033-a085-405b-a425-3ddc1f5e7e39-metrics-tls\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787926 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcs5k\" (UniqueName: \"kubernetes.io/projected/b4bf205f-74bd-4aa7-879c-f034a0fd8465-kube-api-access-vcs5k\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787943 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787959 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ec90488-5451-4df4-b9d6-926ced491b80-proxy-tls\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787977 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e543e751-687f-4a05-b3b0-22733a274f7f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.787996 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh8tp\" (UniqueName: \"kubernetes.io/projected/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-kube-api-access-xh8tp\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788014 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm4x4\" (UniqueName: \"kubernetes.io/projected/fbcd0737-551b-4d4b-bd69-ab6e324fe199-kube-api-access-zm4x4\") pod \"package-server-manager-789f6589d5-6rz8s\" (UID: \"fbcd0737-551b-4d4b-bd69-ab6e324fe199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788042 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhj8\" (UniqueName: \"kubernetes.io/projected/cf1ed808-7fba-44e8-9722-4b87c503ee9c-kube-api-access-2dhj8\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788058 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81b9ec25-c107-4b28-9b74-bc36564db58c-cert\") pod \"ingress-canary-rtqxx\" (UID: \"81b9ec25-c107-4b28-9b74-bc36564db58c\") " pod="openshift-ingress-canary/ingress-canary-rtqxx"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788075 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm2v8\" (UniqueName: \"kubernetes.io/projected/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-kube-api-access-tm2v8\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " pod="openshift-machine-config-operator/machine-config-server-hw7rq"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788091 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfk4q\" (UniqueName: \"kubernetes.io/projected/32d57d80-6063-45cc-9e60-3f72f601cd83-kube-api-access-xfk4q\") pod \"migrator-59844c95c7-7jqjk\" (UID: \"32d57d80-6063-45cc-9e60-3f72f601cd83\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788110 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt4pr\" (UniqueName: \"kubernetes.io/projected/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-kube-api-access-xt4pr\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788151 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45889c6c-eea4-447e-85a6-8045ba5a3fae-webhook-cert\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788171 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6s7x\" (UniqueName: \"kubernetes.io/projected/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-kube-api-access-l6s7x\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788190 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-plugins-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788211 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkjld\" (UniqueName: \"kubernetes.io/projected/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-kube-api-access-lkjld\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788232 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-srv-cert\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788249 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-278hp\" (UniqueName: \"kubernetes.io/projected/81b9ec25-c107-4b28-9b74-bc36564db58c-kube-api-access-278hp\") pod \"ingress-canary-rtqxx\" (UID: \"81b9ec25-c107-4b28-9b74-bc36564db58c\") " pod="openshift-ingress-canary/ingress-canary-rtqxx"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788268 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-mountpoint-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788291 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788309 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7de9fe51-8926-4966-85f1-b14c16db8a74-config-volume\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788327 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-socket-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788344 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788359 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-config\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788374 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b33493db-9d18-4fd8-936a-896a8cbc16c3-srv-cert\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788397 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca0fe033-a085-405b-a425-3ddc1f5e7e39-config-volume\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788411 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-registration-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788428 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-csi-data-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788442 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788458 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf1ed808-7fba-44e8-9722-4b87c503ee9c-signing-key\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788478 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g5mj\" (UniqueName: \"kubernetes.io/projected/ca0fe033-a085-405b-a425-3ddc1f5e7e39-kube-api-access-9g5mj\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788530 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf1ed808-7fba-44e8-9722-4b87c503ee9c-signing-cabundle\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788550 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b33493db-9d18-4fd8-936a-896a8cbc16c3-profile-collector-cert\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788569 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c535fdcc-06de-4e87-a104-962f422a8df0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788590 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8lm8\" (UniqueName: \"kubernetes.io/projected/c535fdcc-06de-4e87-a104-962f422a8df0-kube-api-access-q8lm8\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788606 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788624 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788643 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c535fdcc-06de-4e87-a104-962f422a8df0-trusted-ca\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788666 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543e751-687f-4a05-b3b0-22733a274f7f-config\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788685 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:20 crc 
kubenswrapper[4626]: I0223 06:44:20.788700 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543e751-687f-4a05-b3b0-22733a274f7f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788722 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd5hh\" (UniqueName: \"kubernetes.io/projected/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-kube-api-access-jd5hh\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788737 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/45889c6c-eea4-447e-85a6-8045ba5a3fae-tmpfs\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788754 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7181557-2eef-41be-9c83-ae2f0a1cfbfe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cqdrn\" (UID: \"f7181557-2eef-41be-9c83-ae2f0a1cfbfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788769 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-certs\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " 
pod="openshift-machine-config-operator/machine-config-server-hw7rq" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788791 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hx8f\" (UniqueName: \"kubernetes.io/projected/f7181557-2eef-41be-9c83-ae2f0a1cfbfe-kube-api-access-6hx8f\") pod \"multus-admission-controller-857f4d67dd-cqdrn\" (UID: \"f7181557-2eef-41be-9c83-ae2f0a1cfbfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.789164 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbcd0737-551b-4d4b-bd69-ab6e324fe199-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6rz8s\" (UID: \"fbcd0737-551b-4d4b-bd69-ab6e324fe199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.791999 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7ec90488-5451-4df4-b9d6-926ced491b80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.794913 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-registration-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.797035 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.797411 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-csi-data-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.798058 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-plugins-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.799165 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543e751-687f-4a05-b3b0-22733a274f7f-config\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.808670 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.809825 4626 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-mountpoint-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.810127 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.310114391 +0000 UTC m=+213.649443657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.810898 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/45889c6c-eea4-447e-85a6-8045ba5a3fae-webhook-cert\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.811534 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-srv-cert\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.788292 4626 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7ec90488-5451-4df4-b9d6-926ced491b80-images\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.815392 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf205f-74bd-4aa7-879c-f034a0fd8465-socket-dir\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.828121 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677"] Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.839233 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7de9fe51-8926-4966-85f1-b14c16db8a74-config-volume\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.839951 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ca0fe033-a085-405b-a425-3ddc1f5e7e39-metrics-tls\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.841822 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543e751-687f-4a05-b3b0-22733a274f7f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.842010 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c535fdcc-06de-4e87-a104-962f422a8df0-trusted-ca\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.842273 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/45889c6c-eea4-447e-85a6-8045ba5a3fae-apiservice-cert\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.843316 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-config\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.846178 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db82ca2b-5eac-4858-8808-7b6e22af0e26-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q9kk6\" (UID: \"db82ca2b-5eac-4858-8808-7b6e22af0e26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.851870 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.856262 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81b9ec25-c107-4b28-9b74-bc36564db58c-cert\") pod \"ingress-canary-rtqxx\" (UID: \"81b9ec25-c107-4b28-9b74-bc36564db58c\") " pod="openshift-ingress-canary/ingress-canary-rtqxx" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.856826 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.856830 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-serving-cert\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.871318 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ec90488-5451-4df4-b9d6-926ced491b80-proxy-tls\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.871859 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-proxy-tls\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.872896 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7181557-2eef-41be-9c83-ae2f0a1cfbfe-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cqdrn\" (UID: \"f7181557-2eef-41be-9c83-ae2f0a1cfbfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.872934 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7de9fe51-8926-4966-85f1-b14c16db8a74-secret-volume\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.873299 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf1ed808-7fba-44e8-9722-4b87c503ee9c-signing-key\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.873764 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbcd0737-551b-4d4b-bd69-ab6e324fe199-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6rz8s\" (UID: \"fbcd0737-551b-4d4b-bd69-ab6e324fe199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" Feb 23 06:44:20 crc kubenswrapper[4626]: 
I0223 06:44:20.873835 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.874529 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gb8p\" (UniqueName: \"kubernetes.io/projected/0535917b-9b6d-486b-b932-964a18be9e51-kube-api-access-9gb8p\") pod \"downloads-7954f5f757-b4q5j\" (UID: \"0535917b-9b6d-486b-b932-964a18be9e51\") " pod="openshift-console/downloads-7954f5f757-b4q5j" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.874988 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.875609 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf1ed808-7fba-44e8-9722-4b87c503ee9c-signing-cabundle\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: \"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.876060 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhj8\" (UniqueName: \"kubernetes.io/projected/cf1ed808-7fba-44e8-9722-4b87c503ee9c-kube-api-access-2dhj8\") pod \"service-ca-9c57cc56f-ggnc8\" (UID: 
\"cf1ed808-7fba-44e8-9722-4b87c503ee9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.876923 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ktkzv"] Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.881072 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/45889c6c-eea4-447e-85a6-8045ba5a3fae-tmpfs\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.893459 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-node-bootstrap-token\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " pod="openshift-machine-config-operator/machine-config-server-hw7rq" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.894827 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.895614 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.395598249 +0000 UTC m=+213.734927506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.895813 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.896125 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.396117287 +0000 UTC m=+213.735446543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.897227 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h28h\" (UniqueName: \"kubernetes.io/projected/db82ca2b-5eac-4858-8808-7b6e22af0e26-kube-api-access-6h28h\") pod \"control-plane-machine-set-operator-78cbb6b69f-q9kk6\" (UID: \"db82ca2b-5eac-4858-8808-7b6e22af0e26\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.897744 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b33493db-9d18-4fd8-936a-896a8cbc16c3-srv-cert\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.898386 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-certs\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " pod="openshift-machine-config-operator/machine-config-server-hw7rq" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.898849 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl" 
event={"ID":"4de0065f-e1a8-4dad-8c71-310890524a41","Type":"ContainerStarted","Data":"988b72a5b07ebab5b981541c37f65a4b3ff500c555ee0a9141ec07b848e49e6c"} Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.900949 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b33493db-9d18-4fd8-936a-896a8cbc16c3-profile-collector-cert\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.906097 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xc7q9"] Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.906141 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qnh2c"] Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.906154 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"] Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.912062 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c535fdcc-06de-4e87-a104-962f422a8df0-metrics-tls\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.920165 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ca0fe033-a085-405b-a425-3ddc1f5e7e39-config-volume\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8" Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.920905 4626 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" event={"ID":"fb155a06-d310-4f44-910d-96b02a5ed13b","Type":"ContainerStarted","Data":"63227b4af97b34739c2e35a6e034d132b403eff13e00b05a6bcb768f61c530bc"}
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.929681 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-278hp\" (UniqueName: \"kubernetes.io/projected/81b9ec25-c107-4b28-9b74-bc36564db58c-kube-api-access-278hp\") pod \"ingress-canary-rtqxx\" (UID: \"81b9ec25-c107-4b28-9b74-bc36564db58c\") " pod="openshift-ingress-canary/ingress-canary-rtqxx"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.930705 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" event={"ID":"96f8d72e-3f0a-41a6-813d-2538046dff59","Type":"ContainerStarted","Data":"dc57c2beee85dacfd4e3ddee8a4f319f3a7eef54d1042bd154a512f88dc04359"}
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.932941 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc6sz\" (UniqueName: \"kubernetes.io/projected/b33493db-9d18-4fd8-936a-896a8cbc16c3-kube-api-access-bc6sz\") pod \"catalog-operator-68c6474976-s4szw\" (UID: \"b33493db-9d18-4fd8-936a-896a8cbc16c3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.936558 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" event={"ID":"919789ac-a13f-430c-a00c-5ab73f8e8cba","Type":"ContainerStarted","Data":"d0761d5522b0826edfff4027d21666f1181d0d9a00055424be2bde7e73397c74"}
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.941203 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm2v8\" (UniqueName: \"kubernetes.io/projected/0fde64fa-e1ac-49fc-a091-f88fb21b23b6-kube-api-access-tm2v8\") pod \"machine-config-server-hw7rq\" (UID: \"0fde64fa-e1ac-49fc-a091-f88fb21b23b6\") " pod="openshift-machine-config-operator/machine-config-server-hw7rq"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.959817 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfk4q\" (UniqueName: \"kubernetes.io/projected/32d57d80-6063-45cc-9e60-3f72f601cd83-kube-api-access-xfk4q\") pod \"migrator-59844c95c7-7jqjk\" (UID: \"32d57d80-6063-45cc-9e60-3f72f601cd83\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.975536 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt4pr\" (UniqueName: \"kubernetes.io/projected/e3d070fb-17ac-47fb-aef0-c95a20e0e9eb-kube-api-access-xt4pr\") pod \"kube-storage-version-migrator-operator-b67b599dd-mh49g\" (UID: \"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.985570 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" event={"ID":"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e","Type":"ContainerStarted","Data":"7374ec316f508a0f40659efc95aa74fa858c166f0f5b22af99689d3a781e70bb"}
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.985604 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" event={"ID":"a8528c5d-e8a3-43e6-b17b-2adee2bcc66e","Type":"ContainerStarted","Data":"d8c0a958ec76818b1ef38112458b860ff6e22189fa4df46d4270219fa51705d5"}
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.996241 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.996523 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" event={"ID":"94e222b5-d52d-46c9-ac45-1a0158bdd383","Type":"ContainerStarted","Data":"fcb01a655dc31feeff0ec023e6d5a233c652ac6f9356503888fa290e4077a519"}
Feb 23 06:44:20 crc kubenswrapper[4626]: I0223 06:44:20.996567 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" event={"ID":"94e222b5-d52d-46c9-ac45-1a0158bdd383","Type":"ContainerStarted","Data":"1a7653f8e45c486610ffeef572a09cba7d0f037f28321997ca8eb34b5afe9178"}
Feb 23 06:44:20 crc kubenswrapper[4626]: E0223 06:44:20.996752 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.496735246 +0000 UTC m=+213.836064512 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.000365 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.001986 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mpcw2"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.004259 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77zhp\" (UniqueName: \"kubernetes.io/projected/7ec90488-5451-4df4-b9d6-926ced491b80-kube-api-access-77zhp\") pod \"machine-config-operator-74547568cd-cczv4\" (UID: \"7ec90488-5451-4df4-b9d6-926ced491b80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.011077 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd5hh\" (UniqueName: \"kubernetes.io/projected/6e62bad6-3c66-473b-8b87-22bb9b1bdd33-kube-api-access-jd5hh\") pod \"olm-operator-6b444d44fb-fjblc\" (UID: \"6e62bad6-3c66-473b-8b87-22bb9b1bdd33\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.028335 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.035630 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b4q5j"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.040136 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.041248 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6s7x\" (UniqueName: \"kubernetes.io/projected/2b1c4fc7-c55b-4360-ab3f-6a54be967dc8-kube-api-access-l6s7x\") pod \"machine-config-controller-84d6567774-wbvgp\" (UID: \"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.041412 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" event={"ID":"82862b64-d9a7-4970-b6d7-72afdf910e70","Type":"ContainerStarted","Data":"476766ac7dcca7a66a985426352883f80d56ae704aec7efd246e33124f68d948"}
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.041441 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" event={"ID":"82862b64-d9a7-4970-b6d7-72afdf910e70","Type":"ContainerStarted","Data":"f1fe349c9f9eaf151209cb5eff7a4361b52a0f6aeadefbe8ea642ef41cdcab2f"}
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.041452 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" event={"ID":"82862b64-d9a7-4970-b6d7-72afdf910e70","Type":"ContainerStarted","Data":"40d32ce55b241096d4f566bf2b69f8ed912136fd06afc58f47df41320220bf7d"}
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.046615 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.059891 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.060031 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.061063 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkjld\" (UniqueName: \"kubernetes.io/projected/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-kube-api-access-lkjld\") pod \"marketplace-operator-79b997595-vs6nc\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.068485 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcs5k\" (UniqueName: \"kubernetes.io/projected/b4bf205f-74bd-4aa7-879c-f034a0fd8465-kube-api-access-vcs5k\") pod \"csi-hostpathplugin-fjnzs\" (UID: \"b4bf205f-74bd-4aa7-879c-f034a0fd8465\") " pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.072223 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.085844 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.097597 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g5mj\" (UniqueName: \"kubernetes.io/projected/ca0fe033-a085-405b-a425-3ddc1f5e7e39-kube-api-access-9g5mj\") pod \"dns-default-pj6n8\" (UID: \"ca0fe033-a085-405b-a425-3ddc1f5e7e39\") " pod="openshift-dns/dns-default-pj6n8"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.104446 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.105008 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.106047 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.606033702 +0000 UTC m=+213.945362957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.112957 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.113115 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c535fdcc-06de-4e87-a104-962f422a8df0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.118994 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hw7rq"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.129467 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rtqxx"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.132932 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8lm8\" (UniqueName: \"kubernetes.io/projected/c535fdcc-06de-4e87-a104-962f422a8df0-kube-api-access-q8lm8\") pod \"ingress-operator-5b745b69d9-n5sj6\" (UID: \"c535fdcc-06de-4e87-a104-962f422a8df0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.143925 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pj6n8"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.145266 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k27xh"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.154798 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.178345 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hg9k\" (UniqueName: \"kubernetes.io/projected/45889c6c-eea4-447e-85a6-8045ba5a3fae-kube-api-access-9hg9k\") pod \"packageserver-d55dfcdfc-8nxxc\" (UID: \"45889c6c-eea4-447e-85a6-8045ba5a3fae\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.190066 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh8tp\" (UniqueName: \"kubernetes.io/projected/2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3-kube-api-access-xh8tp\") pod \"service-ca-operator-777779d784-2kn9c\" (UID: \"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.203883 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.206347 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.206459 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.706440713 +0000 UTC m=+214.045769980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.207412 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.208212 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.708201348 +0000 UTC m=+214.047530604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.258382 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjmvt\" (UniqueName: \"kubernetes.io/projected/7de9fe51-8926-4966-85f1-b14c16db8a74-kube-api-access-wjmvt\") pod \"collect-profiles-29530470-ksbmj\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.271144 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.283325 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e543e751-687f-4a05-b3b0-22733a274f7f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gvvts\" (UID: \"e543e751-687f-4a05-b3b0-22733a274f7f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.301473 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm4x4\" (UniqueName: \"kubernetes.io/projected/fbcd0737-551b-4d4b-bd69-ab6e324fe199-kube-api-access-zm4x4\") pod \"package-server-manager-789f6589d5-6rz8s\" (UID: \"fbcd0737-551b-4d4b-bd69-ab6e324fe199\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.302324 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hx8f\" (UniqueName: \"kubernetes.io/projected/f7181557-2eef-41be-9c83-ae2f0a1cfbfe-kube-api-access-6hx8f\") pod \"multus-admission-controller-857f4d67dd-cqdrn\" (UID: \"f7181557-2eef-41be-9c83-ae2f0a1cfbfe\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.312703 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.313448 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.314308 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.315120 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.815092099 +0000 UTC m=+214.154421365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.316126 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.316488 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.816474674 +0000 UTC m=+214.155803939 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.320295 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.352978 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.366788 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.380132 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.387077 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.391175 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zc6wr"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.391282 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.404400 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.422169 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.422847 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:21.922830237 +0000 UTC m=+214.262159503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.489607 4626 csr.go:261] certificate signing request csr-mpvlm is approved, waiting to be issued
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.489677 4626 csr.go:257] certificate signing request csr-mpvlm is issued
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.524949 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.525825 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.025810694 +0000 UTC m=+214.365139950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.627683 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.628344 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.128321498 +0000 UTC m=+214.467650764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.628562 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.628903 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.128895449 +0000 UTC m=+214.468224714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.690712 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.729596 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.730116 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.230097678 +0000 UTC m=+214.569426944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.781249 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b4q5j"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.808709 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"]
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.831176 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.831638 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.331618246 +0000 UTC m=+214.670947512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: I0223 06:44:21.932178 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:21 crc kubenswrapper[4626]: E0223 06:44:21.933712 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.433690273 +0000 UTC m=+214.773019529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:21 crc kubenswrapper[4626]: W0223 06:44:21.943713 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0535917b_9b6d_486b_b932_964a18be9e51.slice/crio-402a1bce961161fb250166a09de32d1e15a18ad4006dc328cca5225fe068150b WatchSource:0}: Error finding container 402a1bce961161fb250166a09de32d1e15a18ad4006dc328cca5225fe068150b: Status 404 returned error can't find the container with id 402a1bce961161fb250166a09de32d1e15a18ad4006dc328cca5225fe068150b
Feb 23 06:44:21 crc kubenswrapper[4626]: W0223 06:44:21.961965 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e62bad6_3c66_473b_8b87_22bb9b1bdd33.slice/crio-e23cbfa191b246eb051a2d50efd615d5bf94a174f5156c00c2f0073b3df71cdc WatchSource:0}: Error finding container e23cbfa191b246eb051a2d50efd615d5bf94a174f5156c00c2f0073b3df71cdc: Status 404 returned error can't find the container with id e23cbfa191b246eb051a2d50efd615d5bf94a174f5156c00c2f0073b3df71cdc
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.038320 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.038950 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.538937029 +0000 UTC m=+214.878266284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.141110 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.141625 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.641607884 +0000 UTC m=+214.980937149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.144672 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk"]
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.145335 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4bfst" event={"ID":"39684a26-7fad-4a8b-9621-99db77c9a01f","Type":"ContainerStarted","Data":"1c24566a7131c850de9b4d017414e038aa978fc3429bb884b7e9ae36c9be9e2c"}
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.145385 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4bfst" event={"ID":"39684a26-7fad-4a8b-9621-99db77c9a01f","Type":"ContainerStarted","Data":"7e55bcb6aa4b6eeadebec38bbb0d300df8303efe5dc0284d1ddc195d3357fa72"}
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.167679 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ktkzv" event={"ID":"4abfb5ed-4161-41d1-9cb5-70a93c60e109","Type":"ContainerStarted","Data":"8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be"}
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.167715 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ktkzv" event={"ID":"4abfb5ed-4161-41d1-9cb5-70a93c60e109","Type":"ContainerStarted","Data":"0cf44727097895470c05496903ad94e3192a1acac67c4f089be04dad2cbafda6"}
Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.193309 4626
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.224667 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl" event={"ID":"4de0065f-e1a8-4dad-8c71-310890524a41","Type":"ContainerStarted","Data":"b912509ea178c89372347273f07b70a3142eae25b3351bee73c6db5f69459cf6"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.241850 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" event={"ID":"15b076e5-da87-4a61-8782-f5f08732ece9","Type":"ContainerStarted","Data":"fccab294349e62fa386a4c439326d9a429f354269efce9f280063b9f67432acf"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.243062 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.244734 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.744721281 +0000 UTC m=+215.084050547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.261444 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5d8pq" podStartSLOduration=147.261429015 podStartE2EDuration="2m27.261429015s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.2539232 +0000 UTC m=+214.593252467" watchObservedRunningTime="2026-02-23 06:44:22.261429015 +0000 UTC m=+214.600758281" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.263338 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ggnc8"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.320478 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" event={"ID":"439f9126-bf49-4c43-aef4-c993cd5d818f","Type":"ContainerStarted","Data":"193ea58ccbfca38cff363df6b06d01874bda6a03c0f89fae46c6980389ce3470"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.343983 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.344541 
4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.844519767 +0000 UTC m=+215.183849033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.391565 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ljz6r" podStartSLOduration=147.391543079 podStartE2EDuration="2m27.391543079s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.348655786 +0000 UTC m=+214.687985042" watchObservedRunningTime="2026-02-23 06:44:22.391543079 +0000 UTC m=+214.730872345" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.428464 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" event={"ID":"f9674062-fec2-47d4-af0e-a7c268402ee2","Type":"ContainerStarted","Data":"75e8a54b7838f12c9890ef6327185812bee005f8d7e862a925f70d4b415043d7"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.428572 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" 
event={"ID":"f9674062-fec2-47d4-af0e-a7c268402ee2","Type":"ContainerStarted","Data":"b0259e2d3ef0775564106211ad959fc33526b6232d4a42bdbddf585b6d321bc6"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.463642 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.468809 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:22.968790868 +0000 UTC m=+215.308120123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.491591 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-23 06:39:21 +0000 UTC, rotation deadline is 2026-11-25 05:55:46.789784012 +0000 UTC Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.491625 4626 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6599h11m24.29816236s for next certificate rotation Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.523054 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-vs6nc"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.579771 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.582566 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.082542039 +0000 UTC m=+215.421871294 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.599213 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.614876 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" event={"ID":"6e62bad6-3c66-473b-8b87-22bb9b1bdd33","Type":"ContainerStarted","Data":"e23cbfa191b246eb051a2d50efd615d5bf94a174f5156c00c2f0073b3df71cdc"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.643826 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-b4q5j" event={"ID":"0535917b-9b6d-486b-b932-964a18be9e51","Type":"ContainerStarted","Data":"402a1bce961161fb250166a09de32d1e15a18ad4006dc328cca5225fe068150b"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.656789 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" podStartSLOduration=146.656771443 podStartE2EDuration="2m26.656771443s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.652774797 +0000 UTC m=+214.992104064" watchObservedRunningTime="2026-02-23 06:44:22.656771443 +0000 UTC m=+214.996100700" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.659453 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pj6n8"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.691006 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.691537 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.191524549 +0000 UTC m=+215.530853815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.702217 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cqdrn"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.703586 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" event={"ID":"f58bc597-63a9-4742-9985-236825024833","Type":"ContainerStarted","Data":"c46c67eac94c9cc7d91a215520d391ac8fe8f6dc811d5246523b3c3bd0094c08"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.705893 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.709174 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:22 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:22 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:22 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.709213 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.742610 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hw7rq" event={"ID":"0fde64fa-e1ac-49fc-a091-f88fb21b23b6","Type":"ContainerStarted","Data":"0966023b7f3f34898514061228277539479be32485367caeddc57c9acd5fa021"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.768315 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" podStartSLOduration=147.768291613 podStartE2EDuration="2m27.768291613s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.768272688 +0000 UTC m=+215.107601954" watchObservedRunningTime="2026-02-23 06:44:22.768291613 +0000 UTC m=+215.107620879" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.792701 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.793238 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.29322149 +0000 UTC m=+215.632550756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.805595 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" event={"ID":"96f8d72e-3f0a-41a6-813d-2538046dff59","Type":"ContainerStarted","Data":"546d84d6627a6d6301585afea6005a0906109d222c06d92b8761d1e1d5234ffc"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.825241 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-4bfst" podStartSLOduration=147.825223023 podStartE2EDuration="2m27.825223023s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.813755118 +0000 UTC m=+215.153084383" watchObservedRunningTime="2026-02-23 06:44:22.825223023 +0000 UTC m=+215.164552299" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.832982 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" event={"ID":"da93645e-0946-47f3-8edd-933148cbc8d2","Type":"ContainerStarted","Data":"7c6cbb4e8d69db1cb52f1484bf53ef9cefe8e118e7a540c5aab79e10d48d5ff4"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.854400 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" 
event={"ID":"919789ac-a13f-430c-a00c-5ab73f8e8cba","Type":"ContainerStarted","Data":"cd4100468081553e98efd6b7d5b6d91f5dcb7145f256dcb4d18540b438dbc450"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.855363 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.861466 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ktkzv" podStartSLOduration=147.861454322 podStartE2EDuration="2m27.861454322s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.852178144 +0000 UTC m=+215.191507400" watchObservedRunningTime="2026-02-23 06:44:22.861454322 +0000 UTC m=+215.200783589" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.864188 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.868992 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" event={"ID":"7728d23e-9cc9-424f-8773-a55ba1e7b940","Type":"ContainerStarted","Data":"26cff5547e573adbafb93a5d7aeaaecada3e76be3017322ed705cce61e9c5cea"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.886016 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.904787 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:22 crc kubenswrapper[4626]: E0223 06:44:22.922115 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.422100959 +0000 UTC m=+215.761430225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.926870 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zc6wr"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.937402 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dz677" podStartSLOduration=147.937383138 podStartE2EDuration="2m27.937383138s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.911388296 +0000 UTC m=+215.250717561" watchObservedRunningTime="2026-02-23 06:44:22.937383138 +0000 UTC m=+215.276712404" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.966669 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" 
event={"ID":"878646c5-5965-4cb2-909b-14737533f2a1","Type":"ContainerStarted","Data":"52fb46567ed385b0966cdc72a45a958068787dcaf416a6e129ca538e4c9b3d91"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.981445 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r5txd" podStartSLOduration=147.981422141 podStartE2EDuration="2m27.981422141s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:22.96569887 +0000 UTC m=+215.305028137" watchObservedRunningTime="2026-02-23 06:44:22.981422141 +0000 UTC m=+215.320751406" Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.996487 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"] Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.998799 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" event={"ID":"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8","Type":"ContainerDied","Data":"34ab02a327877782d8696dad897f95308bd3069ced4de9cfe2ca294c75a72b78"} Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.997825 4626 generic.go:334] "Generic (PLEG): container finished" podID="03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8" containerID="34ab02a327877782d8696dad897f95308bd3069ced4de9cfe2ca294c75a72b78" exitCode=0 Feb 23 06:44:22 crc kubenswrapper[4626]: I0223 06:44:22.998892 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" event={"ID":"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8","Type":"ContainerStarted","Data":"3c352d135e13cb1e30d26d9a5611f49dacfc123f4b9c7d6bf3e47687e2b39fd4"} Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.016036 4626 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.016535 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.016625 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.016675 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.016765 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.017144 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.517128611 +0000 UTC m=+215.856457877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.017927 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pchl5" event={"ID":"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a","Type":"ContainerStarted","Data":"3b0033f8244842ddab8d65c17c8ade9b171e71f6e5a30038b8097b7d94673ab5"} Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.017960 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pchl5" event={"ID":"0b4f1a6f-9b7d-4a57-8494-e23b34eb2e5a","Type":"ContainerStarted","Data":"1c1757a3cba46650470de350b979fa3881095b7cc280683e8b7fb134ff1907e5"} Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.018918 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.024856 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.032217 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.032598 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.036633 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.052997 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rtqxx"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.075315 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" podStartSLOduration=148.075298483 podStartE2EDuration="2m28.075298483s" 
podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:23.037116601 +0000 UTC m=+215.376445867" watchObservedRunningTime="2026-02-23 06:44:23.075298483 +0000 UTC m=+215.414627749" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.082562 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" event={"ID":"77d47a11-2a3a-4803-8f4b-3bfe07c27e00","Type":"ContainerStarted","Data":"27515f0a39f88b47e5e04bb7cea835776dc5a9b7ee985d7e74e5d7bdc83c1d8a"} Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.092962 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.095447 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.102544 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.113646 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.131331 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.133667 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.633650259 +0000 UTC m=+215.972979525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.139223 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" event={"ID":"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb","Type":"ContainerStarted","Data":"83d60e392f926fe6c1ae4017cf27a7a3a9908a19d6444feef36178c635c08209"} Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.151602 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" podStartSLOduration=148.151585483 
podStartE2EDuration="2m28.151585483s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:23.147965526 +0000 UTC m=+215.487294792" watchObservedRunningTime="2026-02-23 06:44:23.151585483 +0000 UTC m=+215.490914749" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.173290 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" event={"ID":"fb155a06-d310-4f44-910d-96b02a5ed13b","Type":"ContainerStarted","Data":"43ccf3e613bcdaf243361b8d8ac0896483646dd6072bebbfc4bcc02f9c3d0b41"} Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.176334 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.221369 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" event={"ID":"1957912e-d933-428f-98b3-65bb43ca2ad0","Type":"ContainerStarted","Data":"97719896d6df9d9eade34dd88418c10f084a3776b6fb2833de2c9a4087f8647b"} Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.221407 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.223162 4626 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zc6wr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.223227 4626 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" podUID="1957912e-d933-428f-98b3-65bb43ca2ad0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.227095 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pchl5" podStartSLOduration=148.227078297 podStartE2EDuration="2m28.227078297s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:23.222032385 +0000 UTC m=+215.561361651" watchObservedRunningTime="2026-02-23 06:44:23.227078297 +0000 UTC m=+215.566407564" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.232107 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.233406 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.733390604 +0000 UTC m=+216.072719871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.321831 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vd8n9" podStartSLOduration=148.321811024 podStartE2EDuration="2m28.321811024s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:23.271767454 +0000 UTC m=+215.611096710" watchObservedRunningTime="2026-02-23 06:44:23.321811024 +0000 UTC m=+215.661140289" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.322641 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fjnzs"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.337558 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.339648 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 06:44:23.839631091 +0000 UTC m=+216.178960347 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.440079 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.440343 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.940315676 +0000 UTC m=+216.279644942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.440643 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.441077 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:23.941063123 +0000 UTC m=+216.280392390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.454546 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.454606 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.457908 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.462347 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" podStartSLOduration=148.46232885 podStartE2EDuration="2m28.46232885s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:23.457565259 +0000 UTC m=+215.796894525" watchObservedRunningTime="2026-02-23 06:44:23.46232885 +0000 UTC m=+215.801658116" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.463165 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.482264 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"] Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.509663 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" podStartSLOduration=147.509645675 podStartE2EDuration="2m27.509645675s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:23.508195775 +0000 UTC m=+215.847525040" watchObservedRunningTime="2026-02-23 06:44:23.509645675 +0000 UTC m=+215.848974941" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.542226 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.542708 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.042679192 +0000 UTC m=+216.382008457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.645399 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.652355 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.15233514 +0000 UTC m=+216.491664406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.664203 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pchl5" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.695583 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" podStartSLOduration=147.695561632 podStartE2EDuration="2m27.695561632s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:23.554768377 +0000 UTC m=+215.894097643" watchObservedRunningTime="2026-02-23 06:44:23.695561632 +0000 UTC m=+216.034890898" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.707831 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:23 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:23 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:23 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.708073 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.772255 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.772813 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.27279295 +0000 UTC m=+216.612122216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.873992 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.874406 4626 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.374393019 +0000 UTC m=+216.713722284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:23 crc kubenswrapper[4626]: I0223 06:44:23.980109 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:23 crc kubenswrapper[4626]: E0223 06:44:23.980795 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.480775903 +0000 UTC m=+216.820105169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.086322 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.087240 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.587226506 +0000 UTC m=+216.926555762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.190833 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.191588 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.691558348 +0000 UTC m=+217.030887614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.239977 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl" event={"ID":"4de0065f-e1a8-4dad-8c71-310890524a41","Type":"ContainerStarted","Data":"a28de5c544ddc7719446014bf813ac611f03eb8fa68f5a00de845ef45f9edb29"} Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.242593 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" event={"ID":"77d47a11-2a3a-4803-8f4b-3bfe07c27e00","Type":"ContainerStarted","Data":"3c6b8c20cfbfac31f52bf89485972d69b8c4cd591d94cfa1eb89ca0682728147"} Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.245210 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qnh2c" event={"ID":"77d47a11-2a3a-4803-8f4b-3bfe07c27e00","Type":"ContainerStarted","Data":"d079fd8654f8d672f42d26c7f85ccf728cba1fc87e136b1f77a5183121239a84"} Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.282122 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-dmjfl" podStartSLOduration=149.282101194 podStartE2EDuration="2m29.282101194s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.280906544 +0000 UTC m=+216.620235810" watchObservedRunningTime="2026-02-23 
06:44:24.282101194 +0000 UTC m=+216.621430459" Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.294763 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.794751346 +0000 UTC m=+217.134080612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.294486 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.372832 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" event={"ID":"7de9fe51-8926-4966-85f1-b14c16db8a74","Type":"ContainerStarted","Data":"86a087b59bd61ad4432a7d8990523be0a09a3135643c8e4a561d95d22cafe9ab"} Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.372878 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" 
event={"ID":"7de9fe51-8926-4966-85f1-b14c16db8a74","Type":"ContainerStarted","Data":"46bde7187f57119acc1f5d9bc4a590721e3476dea682619655290a0baafaa3c9"} Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.392045 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hw7rq" event={"ID":"0fde64fa-e1ac-49fc-a091-f88fb21b23b6","Type":"ContainerStarted","Data":"b41cff5b810ff7ccbecb4306321342a4df9cc3778cf5c814c738288a55d71951"} Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.396763 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.397122 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:24.897106857 +0000 UTC m=+217.236436124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.422998 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" podStartSLOduration=149.422976553 podStartE2EDuration="2m29.422976553s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.411824111 +0000 UTC m=+216.751153377" watchObservedRunningTime="2026-02-23 06:44:24.422976553 +0000 UTC m=+216.762305820"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.425330 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"01f2fb266440d58016dcc1e5775aac473badb4e1006bbd6b6fb9a5d085716b88"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.427558 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pj6n8" event={"ID":"ca0fe033-a085-405b-a425-3ddc1f5e7e39","Type":"ContainerStarted","Data":"4ede0a7fb23ee3e660321e2e210015c955b406971d4e632bad46c01aa511d5a4"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.427588 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pj6n8" event={"ID":"ca0fe033-a085-405b-a425-3ddc1f5e7e39","Type":"ContainerStarted","Data":"70cc54288b4e56532825a2a8497168e9acf6032674c2410129757b7d2ffbe7ab"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.428947 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk" event={"ID":"32d57d80-6063-45cc-9e60-3f72f601cd83","Type":"ContainerStarted","Data":"7607ed651087472cc0009c79b37f2c03f94035aade6771916e6bdca3e9933537"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.428976 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk" event={"ID":"32d57d80-6063-45cc-9e60-3f72f601cd83","Type":"ContainerStarted","Data":"753de0d83204330ad79b9a0936a77883e65208567374db7d8814bd109cb01680"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.428988 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk" event={"ID":"32d57d80-6063-45cc-9e60-3f72f601cd83","Type":"ContainerStarted","Data":"7103d53214d69e1884703e360cbc909191b3b479f28aa16b43ab9e07b2faa02b"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.439681 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hw7rq" podStartSLOduration=6.43966492 podStartE2EDuration="6.43966492s" podCreationTimestamp="2026-02-23 06:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.429961697 +0000 UTC m=+216.769290964" watchObservedRunningTime="2026-02-23 06:44:24.43966492 +0000 UTC m=+216.778994186"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.461564 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7jqjk" podStartSLOduration=148.461544573 podStartE2EDuration="2m28.461544573s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.459329892 +0000 UTC m=+216.798659158" watchObservedRunningTime="2026-02-23 06:44:24.461544573 +0000 UTC m=+216.800873839"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.463680 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" event={"ID":"f7181557-2eef-41be-9c83-ae2f0a1cfbfe","Type":"ContainerStarted","Data":"17281fd4c45a70fe1228585a18ba66e5c36b455399ddd6ce9d1fc000a5584a26"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.463734 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" event={"ID":"f7181557-2eef-41be-9c83-ae2f0a1cfbfe","Type":"ContainerStarted","Data":"86857cef2e3493e4a1923e426935b9d24deee56d5f6fbb381e1d2208ee949a91"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.479526 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" event={"ID":"7ec90488-5451-4df4-b9d6-926ced491b80","Type":"ContainerStarted","Data":"84dbf122d2c62f7c43ab49cbc9489f6a2a8e08fdb6275c43a255326c4cf5795a"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.495702 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" event={"ID":"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3","Type":"ContainerStarted","Data":"bfba9c076d80a5b17ce77bf9dc938ee511dd9de045811cd3e7a6df6c3d6e3093"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.498671 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.500807 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.000792913 +0000 UTC m=+217.340122179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.502865 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rtqxx" event={"ID":"81b9ec25-c107-4b28-9b74-bc36564db58c","Type":"ContainerStarted","Data":"c9ec07721b72fb4599e7c11f2d02ab64cba54f7d53f095849f8a17e196235aa1"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.520918 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" event={"ID":"15b076e5-da87-4a61-8782-f5f08732ece9","Type":"ContainerStarted","Data":"baa819fed2f25f45d3bdef3545f0cd2253a42f60e5836ab60ea2aab0c3b82918"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.531720 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rtqxx" podStartSLOduration=6.531704845 podStartE2EDuration="6.531704845s" podCreationTimestamp="2026-02-23 06:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.526101152 +0000 UTC m=+216.865430418" watchObservedRunningTime="2026-02-23 06:44:24.531704845 +0000 UTC m=+216.871034112"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.547793 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" event={"ID":"6e62bad6-3c66-473b-8b87-22bb9b1bdd33","Type":"ContainerStarted","Data":"e9565cd08a660f523045d055f0c0a1d92b6c10c8e726a9714881a0c74d331c80"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.548667 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.561909 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw" event={"ID":"b33493db-9d18-4fd8-936a-896a8cbc16c3","Type":"ContainerStarted","Data":"609ebccada3cd6f602aa5380542e96c8d749ade5ed0ef614e966865de48d8e5d"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.586741 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.590850 4626 generic.go:334] "Generic (PLEG): container finished" podID="7728d23e-9cc9-424f-8773-a55ba1e7b940" containerID="f6930f7a424349bb7f8af00a79071a987ec9d452365541ed6dc9e9d3284bc1c6" exitCode=0
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.590956 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" event={"ID":"7728d23e-9cc9-424f-8773-a55ba1e7b940","Type":"ContainerDied","Data":"f6930f7a424349bb7f8af00a79071a987ec9d452365541ed6dc9e9d3284bc1c6"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.594614 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.599619 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mr26t" podStartSLOduration=148.599602887 podStartE2EDuration="2m28.599602887s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.549316671 +0000 UTC m=+216.888645937" watchObservedRunningTime="2026-02-23 06:44:24.599602887 +0000 UTC m=+216.938932144"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.600599 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.601474 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fjblc" podStartSLOduration=148.60146289 podStartE2EDuration="2m28.60146289s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.600093582 +0000 UTC m=+216.939422848" watchObservedRunningTime="2026-02-23 06:44:24.60146289 +0000 UTC m=+216.940792156"
Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.606517 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.106476081 +0000 UTC m=+217.445805346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.610724 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6" event={"ID":"db82ca2b-5eac-4858-8808-7b6e22af0e26","Type":"ContainerStarted","Data":"8a253e81aa9c251efe120d9ad7eaab77af19733f90b9532b4434efe3d085209d"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.633025 4626 generic.go:334] "Generic (PLEG): container finished" podID="439f9126-bf49-4c43-aef4-c993cd5d818f" containerID="abc0dfd2f3805182b5705fb15f91902736519caa2290be0a9ed6b5b7f0702af0" exitCode=0
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.633310 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" event={"ID":"439f9126-bf49-4c43-aef4-c993cd5d818f","Type":"ContainerDied","Data":"abc0dfd2f3805182b5705fb15f91902736519caa2290be0a9ed6b5b7f0702af0"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.647649 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp" event={"ID":"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8","Type":"ContainerStarted","Data":"7f0b74a794f62d8c084718c75f76458f9112d31bfea6add536d2e7d781313448"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.647696 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp" event={"ID":"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8","Type":"ContainerStarted","Data":"dbce818f4b72485b8da80f9c772119b9f5675f777e4a400afa2a6d166eb10cf9"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.651485 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" podStartSLOduration=149.651472616 podStartE2EDuration="2m29.651472616s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.62325051 +0000 UTC m=+216.962579775" watchObservedRunningTime="2026-02-23 06:44:24.651472616 +0000 UTC m=+216.990801882"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.691833 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6" event={"ID":"c535fdcc-06de-4e87-a104-962f422a8df0","Type":"ContainerStarted","Data":"794f6a071d03ff92b430548d783821e57207571e6a30fcb881eb636d8a1c36bc"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.706486 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.706954 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.206937625 +0000 UTC m=+217.546266890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.713865 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:44:24 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld
Feb 23 06:44:24 crc kubenswrapper[4626]: [+]process-running ok
Feb 23 06:44:24 crc kubenswrapper[4626]: healthz check failed
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.713904 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.730722 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" event={"ID":"da93645e-0946-47f3-8edd-933148cbc8d2","Type":"ContainerStarted","Data":"260cc623f71850f5dee509f824118bb4e8a88f97dad95ec3168b3ec8daa09d93"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.730763 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" event={"ID":"da93645e-0946-47f3-8edd-933148cbc8d2","Type":"ContainerStarted","Data":"8f36001f8c03e6d7155ba249771cc519a176821e02ecfbe61c5e37040c25a842"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.808301 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.809431 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.309411789 +0000 UTC m=+217.648741055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.814744 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" event={"ID":"fbcd0737-551b-4d4b-bd69-ab6e324fe199","Type":"ContainerStarted","Data":"77833b8cfe5d04bcaa0e9940c90a03ab97bb3760612119c836826d4e3128a871"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.843755 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" event={"ID":"b4bf205f-74bd-4aa7-879c-f034a0fd8465","Type":"ContainerStarted","Data":"938aed9587c81a860f0530169c0579e35d36449acf3b2ca11a3a47971ac04f6d"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.856726 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b4q5j" event={"ID":"0535917b-9b6d-486b-b932-964a18be9e51","Type":"ContainerStarted","Data":"50339c7f8d0546cbb268f858cb71b398918d314b3d7474d77f3fabcf4870c8f4"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.857481 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-b4q5j"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.868534 4626 patch_prober.go:28] interesting pod/downloads-7954f5f757-b4q5j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.868583 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b4q5j" podUID="0535917b-9b6d-486b-b932-964a18be9e51" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.889813 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mh49g" event={"ID":"e3d070fb-17ac-47fb-aef0-c95a20e0e9eb","Type":"ContainerStarted","Data":"9cf9b4f6d8b72b7ef92b33c7318447a11b9b287c4538cf98404f30a67de292dd"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.911722 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:24 crc kubenswrapper[4626]: E0223 06:44:24.912082 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.412070041 +0000 UTC m=+217.751399306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.934001 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" event={"ID":"1957912e-d933-428f-98b3-65bb43ca2ad0","Type":"ContainerStarted","Data":"ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1"}
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.934224 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" podUID="1957912e-d933-428f-98b3-65bb43ca2ad0" containerName="controller-manager" containerID="cri-o://ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1" gracePeriod=30
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.942804 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.989348 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c8plh" podStartSLOduration=149.989329902 podStartE2EDuration="2m29.989329902s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:24.987885151 +0000 UTC m=+217.327214418" watchObservedRunningTime="2026-02-23 06:44:24.989329902 +0000 UTC m=+217.328659168"
Feb 23 06:44:24 crc kubenswrapper[4626]: I0223 06:44:24.997777 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-848ld" event={"ID":"f58bc597-63a9-4742-9985-236825024833","Type":"ContainerStarted","Data":"3180849d0600d31319f630c50d3fa84dc2e8392bb9d4754c7469c630c949ba9c"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.011237 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" event={"ID":"c98d4d0e-9a53-4b64-a26d-11eda45f90fa","Type":"ContainerStarted","Data":"a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.011267 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" event={"ID":"c98d4d0e-9a53-4b64-a26d-11eda45f90fa","Type":"ContainerStarted","Data":"f4599bf751e7ffc79bd41460ce99bcf559107b050fc8a56fb5f65543c0104c46"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.011677 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.012209 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.013260 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.513239237 +0000 UTC m=+217.852568503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.036812 4626 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vs6nc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.036863 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.039965 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" event={"ID":"cf1ed808-7fba-44e8-9722-4b87c503ee9c","Type":"ContainerStarted","Data":"c1956f6af9c0e97ffa0fab60c051b98bdbbec32c3d2648d2528d374f5b2f3d0e"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.040040 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" event={"ID":"cf1ed808-7fba-44e8-9722-4b87c503ee9c","Type":"ContainerStarted","Data":"282d92c69938aa23b4f97f933118136809c17cf3a6c45d437e52dec3f81b82c4"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.068090 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xc7q9" event={"ID":"878646c5-5965-4cb2-909b-14737533f2a1","Type":"ContainerStarted","Data":"967c257092ba25a887d5ac62a98feed2a82c338ca5c778362766787ad147d621"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.103885 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-b4q5j" podStartSLOduration=150.103864728 podStartE2EDuration="2m30.103864728s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:25.103072877 +0000 UTC m=+217.442402134" watchObservedRunningTime="2026-02-23 06:44:25.103864728 +0000 UTC m=+217.443193995"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.104106 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts" event={"ID":"e543e751-687f-4a05-b3b0-22733a274f7f","Type":"ContainerStarted","Data":"d2235331fae954bd892ff5f302a9834cebc1e63df979251d93970bbae4073d39"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.104138 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts" event={"ID":"e543e751-687f-4a05-b3b0-22733a274f7f","Type":"ContainerStarted","Data":"d768301465a2262cc566bc6fb2e61da73e63e5f0c7135272b2cdb107f25ad68e"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.114337 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.116768 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.616752477 +0000 UTC m=+217.956081744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.147182 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" event={"ID":"45889c6c-eea4-447e-85a6-8045ba5a3fae","Type":"ContainerStarted","Data":"dd5bb30eac30eb3a6a6ffa2ca557700e9edb4dac39c60132a7510475b119e598"}
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.147713 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" podUID="94e222b5-d52d-46c9-ac45-1a0158bdd383" containerName="route-controller-manager" containerID="cri-o://fcb01a655dc31feeff0ec023e6d5a233c652ac6f9356503888fa290e4077a519" gracePeriod=30
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.147878 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.157634 4626 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8nxxc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.157672 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" podUID="45889c6c-eea4-447e-85a6-8045ba5a3fae" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.157803 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" podStartSLOduration=149.157779448 podStartE2EDuration="2m29.157779448s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:25.157452602 +0000 UTC m=+217.496781868" watchObservedRunningTime="2026-02-23 06:44:25.157779448 +0000 UTC m=+217.497108715"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.215946 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.217071 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.717055063 +0000 UTC m=+218.056384329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.318164 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.321490 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.821476604 +0000 UTC m=+218.160805871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.398774 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ggnc8" podStartSLOduration=149.398755402 podStartE2EDuration="2m29.398755402s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:25.329097536 +0000 UTC m=+217.668426792" watchObservedRunningTime="2026-02-23 06:44:25.398755402 +0000 UTC m=+217.738084658"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.419810 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.420204 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:25.920187031 +0000 UTC m=+218.259516297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.474970 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gvvts" podStartSLOduration=150.474947443 podStartE2EDuration="2m30.474947443s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:25.399353578 +0000 UTC m=+217.738682845" watchObservedRunningTime="2026-02-23 06:44:25.474947443 +0000 UTC m=+217.814276709"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.475742 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" podStartSLOduration=149.475734385 podStartE2EDuration="2m29.475734385s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:25.474220064 +0000 UTC m=+217.813549330" watchObservedRunningTime="2026-02-23 06:44:25.475734385 +0000 UTC m=+217.815063650"
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.524206 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.524653 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.024639781 +0000 UTC m=+218.363969048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.628995 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.629467 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.129452048 +0000 UTC m=+218.468781314 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.685262 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.685321 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.709772 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:25 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:25 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:25 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.709808 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.731247 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.731686 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.231667106 +0000 UTC m=+218.570996371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.832876 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.833569 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-23 06:44:26.333549073 +0000 UTC m=+218.672878340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.852259 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.916451 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9697bfb97-sfbbg"] Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.922089 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1957912e-d933-428f-98b3-65bb43ca2ad0" containerName="controller-manager" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.922112 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1957912e-d933-428f-98b3-65bb43ca2ad0" containerName="controller-manager" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.923049 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1957912e-d933-428f-98b3-65bb43ca2ad0" containerName="controller-manager" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.927535 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.938001 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-config\") pod \"1957912e-d933-428f-98b3-65bb43ca2ad0\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.938079 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp9c2\" (UniqueName: \"kubernetes.io/projected/1957912e-d933-428f-98b3-65bb43ca2ad0-kube-api-access-vp9c2\") pod \"1957912e-d933-428f-98b3-65bb43ca2ad0\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.938124 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-client-ca\") pod \"1957912e-d933-428f-98b3-65bb43ca2ad0\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.938164 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1957912e-d933-428f-98b3-65bb43ca2ad0-serving-cert\") pod \"1957912e-d933-428f-98b3-65bb43ca2ad0\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.938379 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-proxy-ca-bundles\") pod \"1957912e-d933-428f-98b3-65bb43ca2ad0\" (UID: \"1957912e-d933-428f-98b3-65bb43ca2ad0\") " Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.938746 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:25 crc kubenswrapper[4626]: E0223 06:44:25.939252 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.439238582 +0000 UTC m=+218.778567849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.939844 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-config" (OuterVolumeSpecName: "config") pod "1957912e-d933-428f-98b3-65bb43ca2ad0" (UID: "1957912e-d933-428f-98b3-65bb43ca2ad0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.940174 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1957912e-d933-428f-98b3-65bb43ca2ad0" (UID: "1957912e-d933-428f-98b3-65bb43ca2ad0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.942645 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-client-ca" (OuterVolumeSpecName: "client-ca") pod "1957912e-d933-428f-98b3-65bb43ca2ad0" (UID: "1957912e-d933-428f-98b3-65bb43ca2ad0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.960217 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1957912e-d933-428f-98b3-65bb43ca2ad0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1957912e-d933-428f-98b3-65bb43ca2ad0" (UID: "1957912e-d933-428f-98b3-65bb43ca2ad0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:44:25 crc kubenswrapper[4626]: I0223 06:44:25.972286 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1957912e-d933-428f-98b3-65bb43ca2ad0-kube-api-access-vp9c2" (OuterVolumeSpecName: "kube-api-access-vp9c2") pod "1957912e-d933-428f-98b3-65bb43ca2ad0" (UID: "1957912e-d933-428f-98b3-65bb43ca2ad0"). InnerVolumeSpecName "kube-api-access-vp9c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.003670 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9697bfb97-sfbbg"] Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040272 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.040531 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.540505635 +0000 UTC m=+218.879834901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040596 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040700 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870cbdb2-daed-4802-a6fb-77d817f68d07-serving-cert\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040740 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-config\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040760 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-client-ca\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040791 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-proxy-ca-bundles\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040822 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq5lq\" (UniqueName: \"kubernetes.io/projected/870cbdb2-daed-4802-a6fb-77d817f68d07-kube-api-access-jq5lq\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040870 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1957912e-d933-428f-98b3-65bb43ca2ad0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040882 4626 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040892 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040915 4626 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp9c2\" (UniqueName: \"kubernetes.io/projected/1957912e-d933-428f-98b3-65bb43ca2ad0-kube-api-access-vp9c2\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.040924 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1957912e-d933-428f-98b3-65bb43ca2ad0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.041188 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.541174865 +0000 UTC m=+218.880504131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.119049 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.135400 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.136158 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.138056 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.139354 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.141476 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.141791 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-config\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.141827 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-client-ca\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.141864 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-proxy-ca-bundles\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: 
\"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.141900 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq5lq\" (UniqueName: \"kubernetes.io/projected/870cbdb2-daed-4802-a6fb-77d817f68d07-kube-api-access-jq5lq\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.142004 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870cbdb2-daed-4802-a6fb-77d817f68d07-serving-cert\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.142374 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.64234876 +0000 UTC m=+218.981678026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.143436 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-config\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.145052 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-proxy-ca-bundles\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.148122 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-client-ca\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.160137 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870cbdb2-daed-4802-a6fb-77d817f68d07-serving-cert\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " 
pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.178624 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq5lq\" (UniqueName: \"kubernetes.io/projected/870cbdb2-daed-4802-a6fb-77d817f68d07-kube-api-access-jq5lq\") pod \"controller-manager-9697bfb97-sfbbg\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") " pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.187559 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.187751 4626 generic.go:334] "Generic (PLEG): container finished" podID="1957912e-d933-428f-98b3-65bb43ca2ad0" containerID="ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1" exitCode=0 Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.187825 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" event={"ID":"1957912e-d933-428f-98b3-65bb43ca2ad0","Type":"ContainerDied","Data":"ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1"} Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.187853 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" event={"ID":"1957912e-d933-428f-98b3-65bb43ca2ad0","Type":"ContainerDied","Data":"97719896d6df9d9eade34dd88418c10f084a3776b6fb2833de2c9a4087f8647b"} Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.187872 4626 scope.go:117] "RemoveContainer" containerID="ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.188014 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zc6wr" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.198825 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cd2f9be3ec63cdc79eb5ecfbf279113151431961f3a1364b897e4066d7a51eac"} Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.198854 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bb7580e0654c5430a3b593223b4fdaca5fbfe87bcfd0dc29ccb50fb95e8ad98f"} Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.210388 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"960fab3745ff8e989b25ca2abad3604a63f51098bf4c70342d0be6aa0f1526cf"} Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.210417 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"06b37bfad06824bb857825e6bd8f5937b267e53c355909c2386a44c283030854"} Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.210797 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.248153 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.248330 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.248374 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.249531 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.749514169 +0000 UTC m=+219.088843434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.250711 4626 scope.go:117] "RemoveContainer" containerID="ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1"
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.255940 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1\": container with ID starting with ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1 not found: ID does not exist" containerID="ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.255982 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1"} err="failed to get container status \"ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1\": rpc error: code = NotFound desc = could not find container \"ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1\": container with ID starting with ddfb71a87f8d345d5851c33a94d41e53959fec35d9e142ce355d72d5778122e1 not found: ID does not exist"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.256163 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" event={"ID":"7ec90488-5451-4df4-b9d6-926ced491b80","Type":"ContainerStarted","Data":"9d2a91446ef5429ea8ee2bbf6dbb227eb2187878b16b8e985fdda95b100da5dd"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.256195 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" event={"ID":"7ec90488-5451-4df4-b9d6-926ced491b80","Type":"ContainerStarted","Data":"8086a9587beb92e7097cec3ffcb127db9c6264d21fb9a59764351d595815d8a7"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.280680 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6" event={"ID":"db82ca2b-5eac-4858-8808-7b6e22af0e26","Type":"ContainerStarted","Data":"fdf674f80bac596aefdc58dc0e321888462ee1f800f8b4140fd917e5aa2b484e"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.285916 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zc6wr"]
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.287948 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zc6wr"]
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.295684 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.323218 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" event={"ID":"2aa0ff28-d27a-44d8-a182-dfcadfcdd3c3","Type":"ContainerStarted","Data":"1602fd22ee7e0bde366806db7887ba827d8009424a7b196d384627c0eaf5f1d7"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.333228 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ff821579268076bccdd9596d82e62a22ec768c3f8e147bf52545162e64d5e559"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.349444 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.349654 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.849632046 +0000 UTC m=+219.188961302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.349735 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.349873 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.349905 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.349962 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.351556 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.851543615 +0000 UTC m=+219.190872881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.366852 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" event={"ID":"fbcd0737-551b-4d4b-bd69-ab6e324fe199","Type":"ContainerStarted","Data":"57903e200cb60336b9064d8a2d525084cdc566bfb261423140fe21ac64c42297"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.366894 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" event={"ID":"fbcd0737-551b-4d4b-bd69-ab6e324fe199","Type":"ContainerStarted","Data":"803d4fdecc88755b4f1f84f8949432db87d2c9f04276da481b4e17f584378889"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.367004 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.370838 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" event={"ID":"b4bf205f-74bd-4aa7-879c-f034a0fd8465","Type":"ContainerStarted","Data":"6a3b337084bad9376ac5f6352be726f12cf6cd7dc5930fbe2c6fe9e49348b9ae"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.382334 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.383441 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw" event={"ID":"b33493db-9d18-4fd8-936a-896a8cbc16c3","Type":"ContainerStarted","Data":"3ab4b114d06763f057b950832c2be0245e66e25def04889292648c9127185987"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.384094 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.389052 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.402663 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp" event={"ID":"2b1c4fc7-c55b-4360-ab3f-6a54be967dc8","Type":"ContainerStarted","Data":"5ef88c1ed5921dde53e95e19b22b0b8081827809e80979e9aafc9d0722321b13"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.414547 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rtqxx" event={"ID":"81b9ec25-c107-4b28-9b74-bc36564db58c","Type":"ContainerStarted","Data":"c001a2077be415809f01d3fb6b203d839b96e5caf7502c72f898820bfd7a4a64"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.416253 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" event={"ID":"7728d23e-9cc9-424f-8773-a55ba1e7b940","Type":"ContainerStarted","Data":"e35fe37499912326381d92b7705e75f477b4adcce9a2fa9770caf2c72b4e83c8"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.417631 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" event={"ID":"03b3a2ec-af7c-4589-b9ca-6d6b8ce276f8","Type":"ContainerStarted","Data":"6b9679cd67b40ff4ce40b909c2ef2a0ed2b4c35343d396ba277de506728ea007"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.434325 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" event={"ID":"45889c6c-eea4-447e-85a6-8045ba5a3fae","Type":"ContainerStarted","Data":"f0865c86f14986ada64b12d9b7a3d8638827660877b1bcbf06283bac09b56d67"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.434434 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q9kk6" podStartSLOduration=150.434414584 podStartE2EDuration="2m30.434414584s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.43161937 +0000 UTC m=+218.770948636" watchObservedRunningTime="2026-02-23 06:44:26.434414584 +0000 UTC m=+218.773743850"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.440330 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.450722 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.451235 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.452157 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:26.952140916 +0000 UTC m=+219.291470182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.458840 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kn9c" podStartSLOduration=150.458825884 podStartE2EDuration="2m30.458825884s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.457662313 +0000 UTC m=+218.796991580" watchObservedRunningTime="2026-02-23 06:44:26.458825884 +0000 UTC m=+218.798155151"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.463176 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pj6n8" event={"ID":"ca0fe033-a085-405b-a425-3ddc1f5e7e39","Type":"ContainerStarted","Data":"f733be114adf393d763e586ad22003658c603e373c7be67704890805c1637176"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.463558 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pj6n8"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.489811 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" event={"ID":"439f9126-bf49-4c43-aef4-c993cd5d818f","Type":"ContainerStarted","Data":"d84e252667ae5bf4fe89a50aad75c7c2f810976274291e08e7bb760b5117255d"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.489850 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" event={"ID":"439f9126-bf49-4c43-aef4-c993cd5d818f","Type":"ContainerStarted","Data":"2eadf0ea11ca15b822cff10544818df8915ffde55cc349f5c36bac07dbb82bc4"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.504354 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cczv4" podStartSLOduration=150.504342118 podStartE2EDuration="2m30.504342118s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.503793616 +0000 UTC m=+218.843122881" watchObservedRunningTime="2026-02-23 06:44:26.504342118 +0000 UTC m=+218.843671384"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.521401 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" event={"ID":"f7181557-2eef-41be-9c83-ae2f0a1cfbfe","Type":"ContainerStarted","Data":"df3a17cc1fe8288717b656192149ba64019c60734990c9326f263a5a2dcf4e5d"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.543726 4626 generic.go:334] "Generic (PLEG): container finished" podID="94e222b5-d52d-46c9-ac45-1a0158bdd383" containerID="fcb01a655dc31feeff0ec023e6d5a233c652ac6f9356503888fa290e4077a519" exitCode=0
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.543788 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn" event={"ID":"94e222b5-d52d-46c9-ac45-1a0158bdd383","Type":"ContainerDied","Data":"fcb01a655dc31feeff0ec023e6d5a233c652ac6f9356503888fa290e4077a519"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.543816 4626 scope.go:117] "RemoveContainer" containerID="fcb01a655dc31feeff0ec023e6d5a233c652ac6f9356503888fa290e4077a519"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.543885 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.555346 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e222b5-d52d-46c9-ac45-1a0158bdd383-serving-cert\") pod \"94e222b5-d52d-46c9-ac45-1a0158bdd383\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.555610 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-config\") pod \"94e222b5-d52d-46c9-ac45-1a0158bdd383\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.555669 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9f4h\" (UniqueName: \"kubernetes.io/projected/94e222b5-d52d-46c9-ac45-1a0158bdd383-kube-api-access-w9f4h\") pod \"94e222b5-d52d-46c9-ac45-1a0158bdd383\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.555708 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-client-ca\") pod \"94e222b5-d52d-46c9-ac45-1a0158bdd383\" (UID: \"94e222b5-d52d-46c9-ac45-1a0158bdd383\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.556025 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.557852 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-config" (OuterVolumeSpecName: "config") pod "94e222b5-d52d-46c9-ac45-1a0158bdd383" (UID: "94e222b5-d52d-46c9-ac45-1a0158bdd383"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.558849 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-client-ca" (OuterVolumeSpecName: "client-ca") pod "94e222b5-d52d-46c9-ac45-1a0158bdd383" (UID: "94e222b5-d52d-46c9-ac45-1a0158bdd383"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.559723 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.0597076 +0000 UTC m=+219.399036866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.582824 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e222b5-d52d-46c9-ac45-1a0158bdd383-kube-api-access-w9f4h" (OuterVolumeSpecName: "kube-api-access-w9f4h") pod "94e222b5-d52d-46c9-ac45-1a0158bdd383" (UID: "94e222b5-d52d-46c9-ac45-1a0158bdd383"). InnerVolumeSpecName "kube-api-access-w9f4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.584720 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pj6n8" podStartSLOduration=8.584707077000001 podStartE2EDuration="8.584707077s" podCreationTimestamp="2026-02-23 06:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.53470669 +0000 UTC m=+218.874035956" watchObservedRunningTime="2026-02-23 06:44:26.584707077 +0000 UTC m=+218.924036344"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.586602 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e222b5-d52d-46c9-ac45-1a0158bdd383-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "94e222b5-d52d-46c9-ac45-1a0158bdd383" (UID: "94e222b5-d52d-46c9-ac45-1a0158bdd383"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.605009 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5" podStartSLOduration=150.604997538 podStartE2EDuration="2m30.604997538s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.587102088 +0000 UTC m=+218.926431354" watchObservedRunningTime="2026-02-23 06:44:26.604997538 +0000 UTC m=+218.944326804"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.608199 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6" event={"ID":"c535fdcc-06de-4e87-a104-962f422a8df0","Type":"ContainerStarted","Data":"16ca49b4b7c54fb949aa00d10009f61cd4da252483795894f18c72bf23cf955d"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.608245 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6" event={"ID":"c535fdcc-06de-4e87-a104-962f422a8df0","Type":"ContainerStarted","Data":"402776bbedc131d0363f07240cfdc3aef6cd3c26c0907d0608e369376b4ac30a"}
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.616432 4626 patch_prober.go:28] interesting pod/downloads-7954f5f757-b4q5j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.616477 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b4q5j" podUID="0535917b-9b6d-486b-b932-964a18be9e51" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.617212 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.660857 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" podStartSLOduration=151.660845217 podStartE2EDuration="2m31.660845217s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.626842916 +0000 UTC m=+218.966172182" watchObservedRunningTime="2026-02-23 06:44:26.660845217 +0000 UTC m=+219.000174483"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.661965 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.661978 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wbvgp" podStartSLOduration=150.661971148 podStartE2EDuration="2m30.661971148s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.659650568 +0000 UTC m=+218.998979834" watchObservedRunningTime="2026-02-23 06:44:26.661971148 +0000 UTC m=+219.001300414"
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.662387 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.162369227 +0000 UTC m=+219.501698493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.662995 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.663214 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e222b5-d52d-46c9-ac45-1a0158bdd383-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.663235 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-config\") on node \"crc\" DevicePath \"\""
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.663245 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9f4h\" (UniqueName: \"kubernetes.io/projected/94e222b5-d52d-46c9-ac45-1a0158bdd383-kube-api-access-w9f4h\") on node \"crc\" DevicePath \"\""
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.663255 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94e222b5-d52d-46c9-ac45-1a0158bdd383-client-ca\") on node \"crc\" DevicePath \"\""
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.664325 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.164315082 +0000 UTC m=+219.503644348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.682804 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s" podStartSLOduration=150.682794661 podStartE2EDuration="2m30.682794661s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.681900527 +0000 UTC m=+219.021229793" watchObservedRunningTime="2026-02-23 06:44:26.682794661 +0000 UTC m=+219.022123927"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.710665 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:44:26 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld
Feb 23 06:44:26 crc kubenswrapper[4626]: [+]process-running ok
Feb 23 06:44:26 crc kubenswrapper[4626]: healthz check failed
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.710748 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.764999 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.766231 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.266213532 +0000 UTC m=+219.605542798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.867301 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.867791 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.36777672 +0000 UTC m=+219.707105986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.934362 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s4szw" podStartSLOduration=150.934342494 podStartE2EDuration="2m30.934342494s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.929715921 +0000 UTC m=+219.269045187" watchObservedRunningTime="2026-02-23 06:44:26.934342494 +0000 UTC m=+219.273671760"
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.968355 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.968526 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.468487714 +0000 UTC m=+219.807816981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.968719 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:26 crc kubenswrapper[4626]: E0223 06:44:26.968996 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.468988007 +0000 UTC m=+219.808317272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:44:26 crc kubenswrapper[4626]: I0223 06:44:26.990936 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-cqdrn" podStartSLOduration=150.990900892 podStartE2EDuration="2m30.990900892s" podCreationTimestamp="2026-02-23 06:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:26.98921116 +0000 UTC m=+219.328540426" watchObservedRunningTime="2026-02-23 06:44:26.990900892 +0000 UTC m=+219.330230158"
Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.073879 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.073979 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.573962218 +0000 UTC m=+219.913291485 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.074291 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.074588 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.574580142 +0000 UTC m=+219.913909408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.170194 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-n5sj6" podStartSLOduration=152.170176424 podStartE2EDuration="2m32.170176424s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:27.073292687 +0000 UTC m=+219.412621954" watchObservedRunningTime="2026-02-23 06:44:27.170176424 +0000 UTC m=+219.509505690" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.177283 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.178072 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.6780544 +0000 UTC m=+220.017383666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.196350 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.207963 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-97rfn"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.279721 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.280237 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.78022401 +0000 UTC m=+220.119553267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.380749 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.380940 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.880894238 +0000 UTC m=+220.220223504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.381093 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.381411 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.881395482 +0000 UTC m=+220.220724748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.436073 4626 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8nxxc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.436129 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" podUID="45889c6c-eea4-447e-85a6-8045ba5a3fae" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.473057 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wb2fx"] Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.473274 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e222b5-d52d-46c9-ac45-1a0158bdd383" containerName="route-controller-manager" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.473293 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e222b5-d52d-46c9-ac45-1a0158bdd383" containerName="route-controller-manager" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.473382 4626 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="94e222b5-d52d-46c9-ac45-1a0158bdd383" containerName="route-controller-manager" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.474072 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.479260 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.482467 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.482600 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.98257606 +0000 UTC m=+220.321905326 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.482786 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.483094 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:27.983081061 +0000 UTC m=+220.322410327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.509212 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9697bfb97-sfbbg"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.515083 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.516588 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb2fx"] Feb 23 06:44:27 crc kubenswrapper[4626]: W0223 06:44:27.539943 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870cbdb2_daed_4802_a6fb_77d817f68d07.slice/crio-36b94649b8cc3a33f308f3a1b072908f2331fea283a3d04c46759640380a7c43 WatchSource:0}: Error finding container 36b94649b8cc3a33f308f3a1b072908f2331fea283a3d04c46759640380a7c43: Status 404 returned error can't find the container with id 36b94649b8cc3a33f308f3a1b072908f2331fea283a3d04c46759640380a7c43 Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.586353 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.586645 
4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-catalog-content\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.586679 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-utilities\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.586705 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7hcb\" (UniqueName: \"kubernetes.io/projected/b3028805-3229-4cc3-9e20-2bca252b2c19-kube-api-access-v7hcb\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.586809 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:28.086792495 +0000 UTC m=+220.426121761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.640733 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4bbde41b-950b-49d8-97eb-470ca9ebfdb8","Type":"ContainerStarted","Data":"b16e14d7cf0fe3418c926a72ae06d59b8815ed641b7d45b1b0a27466508d9676"} Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.648291 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" event={"ID":"b4bf205f-74bd-4aa7-879c-f034a0fd8465","Type":"ContainerStarted","Data":"d2fc0fc2701d4513adee7c501f086abdd184b475f8ed80bd61031ca00bd4e3eb"} Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.648350 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" event={"ID":"b4bf205f-74bd-4aa7-879c-f034a0fd8465","Type":"ContainerStarted","Data":"47fb245f48b14e23379965517e5886d7db85c6a3d74a570825ccf5f5ee466c87"} Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.652202 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" event={"ID":"870cbdb2-daed-4802-a6fb-77d817f68d07","Type":"ContainerStarted","Data":"36b94649b8cc3a33f308f3a1b072908f2331fea283a3d04c46759640380a7c43"} Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.653756 4626 patch_prober.go:28] interesting pod/downloads-7954f5f757-b4q5j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.653811 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b4q5j" podUID="0535917b-9b6d-486b-b932-964a18be9e51" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.659801 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8nxxc" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.676446 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k27xh" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.678731 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-swmdj"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.681107 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.688624 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.688967 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7hcb\" (UniqueName: \"kubernetes.io/projected/b3028805-3229-4cc3-9e20-2bca252b2c19-kube-api-access-v7hcb\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.689009 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.689115 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-catalog-content\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.689148 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-utilities\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.695039 4626 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:28.194995257 +0000 UTC m=+220.534324523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.726946 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-catalog-content\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.728680 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:27 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:27 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:27 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.728742 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:27 crc 
kubenswrapper[4626]: I0223 06:44:27.730231 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-utilities\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.730296 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swmdj"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.746949 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7hcb\" (UniqueName: \"kubernetes.io/projected/b3028805-3229-4cc3-9e20-2bca252b2c19-kube-api-access-v7hcb\") pod \"community-operators-wb2fx\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") " pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.789013 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.814934 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.816291 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-catalog-content\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.816402 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-utilities\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.817005 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65kcn\" (UniqueName: \"kubernetes.io/projected/a1530c30-549e-4d85-b7ee-086832420311-kube-api-access-65kcn\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.817959 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-23 06:44:28.317934951 +0000 UTC m=+220.657264217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.878838 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t66n4"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.880790 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.897712 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t66n4"] Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.920515 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65kcn\" (UniqueName: \"kubernetes.io/projected/a1530c30-549e-4d85-b7ee-086832420311-kube-api-access-65kcn\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.920568 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:27 crc 
kubenswrapper[4626]: I0223 06:44:27.920623 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-catalog-content\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.920647 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-utilities\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.921034 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-utilities\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: E0223 06:44:27.921606 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:28.421590851 +0000 UTC m=+220.760920117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.921968 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-catalog-content\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:27 crc kubenswrapper[4626]: I0223 06:44:27.964171 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65kcn\" (UniqueName: \"kubernetes.io/projected/a1530c30-549e-4d85-b7ee-086832420311-kube-api-access-65kcn\") pod \"certified-operators-swmdj\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") " pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.005542 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1957912e-d933-428f-98b3-65bb43ca2ad0" path="/var/lib/kubelet/pods/1957912e-d933-428f-98b3-65bb43ca2ad0/volumes" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.006265 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e222b5-d52d-46c9-ac45-1a0158bdd383" path="/var/lib/kubelet/pods/94e222b5-d52d-46c9-ac45-1a0158bdd383/volumes" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.021721 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.022232 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj87f\" (UniqueName: \"kubernetes.io/projected/1f7808e4-0cf2-47e0-aa06-62282f4111db-kube-api-access-vj87f\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: E0223 06:44:28.022324 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:28.522304761 +0000 UTC m=+220.861634026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.022363 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-utilities\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.022406 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-catalog-content\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.046904 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.050925 4626 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.082524 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p8ddh"] Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.083540 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.092110 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8ddh"] Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.124416 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj87f\" (UniqueName: \"kubernetes.io/projected/1f7808e4-0cf2-47e0-aa06-62282f4111db-kube-api-access-vj87f\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.124489 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-utilities\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.124546 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-catalog-content\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.124611 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:28 crc kubenswrapper[4626]: E0223 06:44:28.124942 4626 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:44:28.624931161 +0000 UTC m=+220.964260427 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xgpg7" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.125903 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-utilities\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.126195 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-catalog-content\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.155031 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj87f\" (UniqueName: \"kubernetes.io/projected/1f7808e4-0cf2-47e0-aa06-62282f4111db-kube-api-access-vj87f\") pod \"community-operators-t66n4\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.190443 4626 reconciler.go:161] 
"OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-23T06:44:28.050943641Z","Handler":null,"Name":""} Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.217046 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.228107 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.228597 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-utilities\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.228678 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-catalog-content\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.228714 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsdbd\" (UniqueName: \"kubernetes.io/projected/850794ce-0d65-45b4-81be-15dd522ab8fc-kube-api-access-xsdbd\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " 
pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: E0223 06:44:28.228869 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:44:28.728844896 +0000 UTC m=+221.068174163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.243602 4626 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.243633 4626 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.330940 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-utilities\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.331022 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.331086 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-catalog-content\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.331141 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsdbd\" (UniqueName: \"kubernetes.io/projected/850794ce-0d65-45b4-81be-15dd522ab8fc-kube-api-access-xsdbd\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.333922 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-utilities\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.335692 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-catalog-content\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.355194 4626 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.355237 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.374238 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsdbd\" (UniqueName: \"kubernetes.io/projected/850794ce-0d65-45b4-81be-15dd522ab8fc-kube-api-access-xsdbd\") pod \"certified-operators-p8ddh\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.395304 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"] Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.396250 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.421922 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.422004 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.422096 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.422277 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.422491 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.422835 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.424799 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"] Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.435095 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.484580 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xgpg7\" (UID: 
\"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.501235 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.537824 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.538148 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvtxf\" (UniqueName: \"kubernetes.io/projected/f329f420-1c05-4b54-8de7-7f03d9cd78e4-kube-api-access-wvtxf\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.538181 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f329f420-1c05-4b54-8de7-7f03d9cd78e4-serving-cert\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.538218 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-client-ca\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.538281 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-config\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.593314 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.616881 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wb2fx"] Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.639133 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvtxf\" (UniqueName: \"kubernetes.io/projected/f329f420-1c05-4b54-8de7-7f03d9cd78e4-kube-api-access-wvtxf\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.639171 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f329f420-1c05-4b54-8de7-7f03d9cd78e4-serving-cert\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: 
\"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.639198 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-client-ca\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.639243 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-config\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.640283 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-config\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.642171 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-client-ca\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.662355 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvtxf\" (UniqueName: 
\"kubernetes.io/projected/f329f420-1c05-4b54-8de7-7f03d9cd78e4-kube-api-access-wvtxf\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.662352 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f329f420-1c05-4b54-8de7-7f03d9cd78e4-serving-cert\") pod \"route-controller-manager-6d8f7958cb-fmdvg\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") " pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.666085 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swmdj"] Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.693327 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4bbde41b-950b-49d8-97eb-470ca9ebfdb8","Type":"ContainerStarted","Data":"5ee04dac5bbc2491f4b869bc9ddb489fc0f8cde25ca78b2fdcca7016b186f0c6"} Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.713300 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" event={"ID":"b4bf205f-74bd-4aa7-879c-f034a0fd8465","Type":"ContainerStarted","Data":"02332624aafb1fdb973279147c54b8c27ee32386b8bbbbb80b8a0d52b56fdebe"} Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.713746 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.713728345 podStartE2EDuration="2.713728345s" podCreationTimestamp="2026-02-23 06:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:28.708763285 
+0000 UTC m=+221.048092551" watchObservedRunningTime="2026-02-23 06:44:28.713728345 +0000 UTC m=+221.053057611" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.734133 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" event={"ID":"870cbdb2-daed-4802-a6fb-77d817f68d07","Type":"ContainerStarted","Data":"40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011"} Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.734177 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.761789 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.788984 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.791686 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:28 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:28 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:28 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.791718 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.811856 4626 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-fjnzs" podStartSLOduration=10.81183828 podStartE2EDuration="10.81183828s" podCreationTimestamp="2026-02-23 06:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:28.752380142 +0000 UTC m=+221.091709408" watchObservedRunningTime="2026-02-23 06:44:28.81183828 +0000 UTC m=+221.151167537" Feb 23 06:44:28 crc kubenswrapper[4626]: I0223 06:44:28.865842 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" podStartSLOduration=5.865823853 podStartE2EDuration="5.865823853s" podCreationTimestamp="2026-02-23 06:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:28.813037529 +0000 UTC m=+221.152366785" watchObservedRunningTime="2026-02-23 06:44:28.865823853 +0000 UTC m=+221.205153119" Feb 23 06:44:28 crc kubenswrapper[4626]: E0223 06:44:28.934234 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod4bbde41b_950b_49d8_97eb_470ca9ebfdb8.slice/crio-5ee04dac5bbc2491f4b869bc9ddb489fc0f8cde25ca78b2fdcca7016b186f0c6.scope\": RecentStats: unable to find data in memory cache]" Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.041332 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p8ddh"] Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.123310 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t66n4"] Feb 23 06:44:29 crc kubenswrapper[4626]: W0223 06:44:29.140816 4626 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7808e4_0cf2_47e0_aa06_62282f4111db.slice/crio-529243a204f9102a604fd11720b5e4964bce5b14360a2e69fd26ade01930c54a WatchSource:0}: Error finding container 529243a204f9102a604fd11720b5e4964bce5b14360a2e69fd26ade01930c54a: Status 404 returned error can't find the container with id 529243a204f9102a604fd11720b5e4964bce5b14360a2e69fd26ade01930c54a Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.238221 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xgpg7"] Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.427211 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"] Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.506209 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wvhss"] Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.512269 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.515521 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.521975 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvhss"]
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.673139 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qlg6\" (UniqueName: \"kubernetes.io/projected/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-kube-api-access-7qlg6\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.673227 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-catalog-content\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.673250 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-utilities\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.701937 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:44:29 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld
Feb 23 06:44:29 crc kubenswrapper[4626]: [+]process-running ok
Feb 23 06:44:29 crc kubenswrapper[4626]: healthz check failed
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.702029 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.738657 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" event={"ID":"f329f420-1c05-4b54-8de7-7f03d9cd78e4","Type":"ContainerStarted","Data":"f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.738698 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" event={"ID":"f329f420-1c05-4b54-8de7-7f03d9cd78e4","Type":"ContainerStarted","Data":"a79048a229894f6bac8e16afb9662e5bf47c55892401cf01bc88eb8154e0cf57"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.739457 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.740736 4626 patch_prober.go:28] interesting pod/route-controller-manager-6d8f7958cb-fmdvg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body=
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.740769 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" podUID="f329f420-1c05-4b54-8de7-7f03d9cd78e4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.741573 4626 generic.go:334] "Generic (PLEG): container finished" podID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerID="0aa5ad3c06fe369a8cc17b6f385634c5c5e2bfb0ee5bd51d7f8798ba788880b9" exitCode=0
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.741603 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8ddh" event={"ID":"850794ce-0d65-45b4-81be-15dd522ab8fc","Type":"ContainerDied","Data":"0aa5ad3c06fe369a8cc17b6f385634c5c5e2bfb0ee5bd51d7f8798ba788880b9"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.741645 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8ddh" event={"ID":"850794ce-0d65-45b4-81be-15dd522ab8fc","Type":"ContainerStarted","Data":"b54c993cbab9f63e60d5b8097dab0ffaa664a6330a6c16c976ee143bdbc1c633"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.743144 4626 generic.go:334] "Generic (PLEG): container finished" podID="4bbde41b-950b-49d8-97eb-470ca9ebfdb8" containerID="5ee04dac5bbc2491f4b869bc9ddb489fc0f8cde25ca78b2fdcca7016b186f0c6" exitCode=0
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.743185 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4bbde41b-950b-49d8-97eb-470ca9ebfdb8","Type":"ContainerDied","Data":"5ee04dac5bbc2491f4b869bc9ddb489fc0f8cde25ca78b2fdcca7016b186f0c6"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.743796 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.744775 4626 generic.go:334] "Generic (PLEG): container finished" podID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerID="dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08" exitCode=0
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.744814 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb2fx" event={"ID":"b3028805-3229-4cc3-9e20-2bca252b2c19","Type":"ContainerDied","Data":"dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.744847 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb2fx" event={"ID":"b3028805-3229-4cc3-9e20-2bca252b2c19","Type":"ContainerStarted","Data":"0faa0efbc074cbb99ce742ae87ca0fd1808829c553459238ee7fa4f1fcc11886"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.747673 4626 generic.go:334] "Generic (PLEG): container finished" podID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerID="e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3" exitCode=0
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.747776 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t66n4" event={"ID":"1f7808e4-0cf2-47e0-aa06-62282f4111db","Type":"ContainerDied","Data":"e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.748117 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t66n4" event={"ID":"1f7808e4-0cf2-47e0-aa06-62282f4111db","Type":"ContainerStarted","Data":"529243a204f9102a604fd11720b5e4964bce5b14360a2e69fd26ade01930c54a"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.749352 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" event={"ID":"7e5e4e49-7bad-4e99-a662-b0f4ca041477","Type":"ContainerStarted","Data":"395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.749400 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" event={"ID":"7e5e4e49-7bad-4e99-a662-b0f4ca041477","Type":"ContainerStarted","Data":"da776f2d23e6183ae26feda02b1e21899237587657bfda4b02531ad25964ab9b"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.749490 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.750529 4626 generic.go:334] "Generic (PLEG): container finished" podID="a1530c30-549e-4d85-b7ee-086832420311" containerID="82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4" exitCode=0
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.750584 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swmdj" event={"ID":"a1530c30-549e-4d85-b7ee-086832420311","Type":"ContainerDied","Data":"82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.750946 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swmdj" event={"ID":"a1530c30-549e-4d85-b7ee-086832420311","Type":"ContainerStarted","Data":"4bfeb70ce0dd8f3e9204f3cc51f4657f13e47f9cf63c077758cd4c51bdb6360e"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.774193 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qlg6\" (UniqueName: \"kubernetes.io/projected/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-kube-api-access-7qlg6\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.774263 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-catalog-content\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.774282 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-utilities\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.774682 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-utilities\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.775062 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-catalog-content\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.778780 4626 generic.go:334] "Generic (PLEG): container finished" podID="7de9fe51-8926-4966-85f1-b14c16db8a74" containerID="86a087b59bd61ad4432a7d8990523be0a09a3135643c8e4a561d95d22cafe9ab" exitCode=0
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.779031 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" event={"ID":"7de9fe51-8926-4966-85f1-b14c16db8a74","Type":"ContainerDied","Data":"86a087b59bd61ad4432a7d8990523be0a09a3135643c8e4a561d95d22cafe9ab"}
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.806209 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qlg6\" (UniqueName: \"kubernetes.io/projected/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-kube-api-access-7qlg6\") pod \"redhat-marketplace-wvhss\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") " pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.848863 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.865601 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" podStartSLOduration=6.865581618 podStartE2EDuration="6.865581618s" podCreationTimestamp="2026-02-23 06:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:29.800343515 +0000 UTC m=+222.139672771" watchObservedRunningTime="2026-02-23 06:44:29.865581618 +0000 UTC m=+222.204910884"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.893696 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9ps6z"]
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.894748 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.934925 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ps6z"]
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.959431 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" podStartSLOduration=154.959415481 podStartE2EDuration="2m34.959415481s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:44:29.958094624 +0000 UTC m=+222.297423889" watchObservedRunningTime="2026-02-23 06:44:29.959415481 +0000 UTC m=+222.298744736"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.978041 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b8sm\" (UniqueName: \"kubernetes.io/projected/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-kube-api-access-7b8sm\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.978117 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-catalog-content\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.978190 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-utilities\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:29 crc kubenswrapper[4626]: I0223 06:44:29.993471 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.079380 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-catalog-content\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.079665 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-utilities\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.079796 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b8sm\" (UniqueName: \"kubernetes.io/projected/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-kube-api-access-7b8sm\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.080437 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-catalog-content\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.080677 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-utilities\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.112120 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b8sm\" (UniqueName: \"kubernetes.io/projected/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-kube-api-access-7b8sm\") pod \"redhat-marketplace-9ps6z\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.210202 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ps6z"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.259937 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.259986 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.261913 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvhss"]
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.281542 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.321316 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.321342 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ktkzv"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.322562 4626 patch_prober.go:28] interesting pod/console-f9d7485db-ktkzv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.322598 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ktkzv" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.426333 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.433862 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.446228 4626 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mpcw2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]log ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]etcd ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/max-in-flight-filter ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/openshift.io-startinformers ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/openshift.io-restmapperupdater ok
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 23 06:44:30 crc kubenswrapper[4626]: livez check failed
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.446274 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" podUID="439f9126-bf49-4c43-aef4-c993cd5d818f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.698750 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6rbtx"]
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.699660 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-4bfst"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.700090 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.705744 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:44:30 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld
Feb 23 06:44:30 crc kubenswrapper[4626]: [+]process-running ok
Feb 23 06:44:30 crc kubenswrapper[4626]: healthz check failed
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.705792 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.706671 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.736933 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rbtx"]
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.775475 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ps6z"]
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.796762 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-utilities\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.796886 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-catalog-content\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.797145 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzzgw\" (UniqueName: \"kubernetes.io/projected/ad17db5e-dbea-4e58-94b2-6a897f475993-kube-api-access-fzzgw\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: W0223 06:44:30.831205 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb157932b_4d8e_46fd_8e45_f7e9a8f0cd91.slice/crio-c6a0a43be667556bbaac123f691b2b9806c3ac9fce9edf0dc05732dff5752eae WatchSource:0}: Error finding container c6a0a43be667556bbaac123f691b2b9806c3ac9fce9edf0dc05732dff5752eae: Status 404 returned error can't find the container with id c6a0a43be667556bbaac123f691b2b9806c3ac9fce9edf0dc05732dff5752eae
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.831318 4626 generic.go:334] "Generic (PLEG): container finished" podID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerID="54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092" exitCode=0
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.833248 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvhss" event={"ID":"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5","Type":"ContainerDied","Data":"54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092"}
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.833288 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvhss" event={"ID":"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5","Type":"ContainerStarted","Data":"b9dd0c3a15c251341ad4364af13a0f16faac424b6a02e93ed9e600d740a4425c"}
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.850011 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rqbt5"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.851921 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.902217 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-catalog-content\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.902310 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzzgw\" (UniqueName: \"kubernetes.io/projected/ad17db5e-dbea-4e58-94b2-6a897f475993-kube-api-access-fzzgw\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.902444 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-utilities\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.902820 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-utilities\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.904284 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-catalog-content\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:30 crc kubenswrapper[4626]: I0223 06:44:30.983035 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzzgw\" (UniqueName: \"kubernetes.io/projected/ad17db5e-dbea-4e58-94b2-6a897f475993-kube-api-access-fzzgw\") pod \"redhat-operators-6rbtx\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") " pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.020603 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.042620 4626 patch_prober.go:28] interesting pod/downloads-7954f5f757-b4q5j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.042666 4626 patch_prober.go:28] interesting pod/downloads-7954f5f757-b4q5j container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.042679 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b4q5j" podUID="0535917b-9b6d-486b-b932-964a18be9e51" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.042720 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b4q5j" podUID="0535917b-9b6d-486b-b932-964a18be9e51" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.094610 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bn5h9"]
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.095579 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.134945 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn5h9"]
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.208353 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-catalog-content\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.208470 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-utilities\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.208532 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9bdc\" (UniqueName: \"kubernetes.io/projected/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-kube-api-access-c9bdc\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.310544 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-catalog-content\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.311187 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-utilities\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.311233 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9bdc\" (UniqueName: \"kubernetes.io/projected/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-kube-api-access-c9bdc\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.312033 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-catalog-content\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.312296 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-utilities\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.375227 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9bdc\" (UniqueName: \"kubernetes.io/projected/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-kube-api-access-c9bdc\") pod \"redhat-operators-bn5h9\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.427809 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn5h9"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.541168 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.542117 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.557934 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.563772 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.569886 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.583992 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.617964 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kube-api-access\") pod \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\" (UID: \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") "
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.618181 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kubelet-dir\") pod \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\" (UID: \"4bbde41b-950b-49d8-97eb-470ca9ebfdb8\") "
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.618541 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.618586 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.618757 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4bbde41b-950b-49d8-97eb-470ca9ebfdb8" (UID: "4bbde41b-950b-49d8-97eb-470ca9ebfdb8"). InnerVolumeSpecName "kubelet-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.641457 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4bbde41b-950b-49d8-97eb-470ca9ebfdb8" (UID: "4bbde41b-950b-49d8-97eb-470ca9ebfdb8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.721368 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.721649 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.721721 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.721733 4626 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bbde41b-950b-49d8-97eb-470ca9ebfdb8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.722033 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.732644 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:31 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:31 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:31 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.732686 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.771461 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.877119 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.907664 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.909655 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4bbde41b-950b-49d8-97eb-470ca9ebfdb8","Type":"ContainerDied","Data":"b16e14d7cf0fe3418c926a72ae06d59b8815ed641b7d45b1b0a27466508d9676"} Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.909720 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b16e14d7cf0fe3418c926a72ae06d59b8815ed641b7d45b1b0a27466508d9676" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.909666 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.918762 4626 generic.go:334] "Generic (PLEG): container finished" podID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerID="9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e" exitCode=0 Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.920145 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ps6z" event={"ID":"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91","Type":"ContainerDied","Data":"9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e"} Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.920180 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ps6z" event={"ID":"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91","Type":"ContainerStarted","Data":"c6a0a43be667556bbaac123f691b2b9806c3ac9fce9edf0dc05732dff5752eae"} Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.923371 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7de9fe51-8926-4966-85f1-b14c16db8a74-config-volume\") pod 
\"7de9fe51-8926-4966-85f1-b14c16db8a74\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.926065 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de9fe51-8926-4966-85f1-b14c16db8a74-config-volume" (OuterVolumeSpecName: "config-volume") pod "7de9fe51-8926-4966-85f1-b14c16db8a74" (UID: "7de9fe51-8926-4966-85f1-b14c16db8a74"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.934611 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjmvt\" (UniqueName: \"kubernetes.io/projected/7de9fe51-8926-4966-85f1-b14c16db8a74-kube-api-access-wjmvt\") pod \"7de9fe51-8926-4966-85f1-b14c16db8a74\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.934708 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7de9fe51-8926-4966-85f1-b14c16db8a74-secret-volume\") pod \"7de9fe51-8926-4966-85f1-b14c16db8a74\" (UID: \"7de9fe51-8926-4966-85f1-b14c16db8a74\") " Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.934976 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7de9fe51-8926-4966-85f1-b14c16db8a74-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.951908 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de9fe51-8926-4966-85f1-b14c16db8a74-kube-api-access-wjmvt" (OuterVolumeSpecName: "kube-api-access-wjmvt") pod "7de9fe51-8926-4966-85f1-b14c16db8a74" (UID: "7de9fe51-8926-4966-85f1-b14c16db8a74"). InnerVolumeSpecName "kube-api-access-wjmvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:44:31 crc kubenswrapper[4626]: I0223 06:44:31.952101 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de9fe51-8926-4966-85f1-b14c16db8a74-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7de9fe51-8926-4966-85f1-b14c16db8a74" (UID: "7de9fe51-8926-4966-85f1-b14c16db8a74"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.026114 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rbtx"] Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.037923 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjmvt\" (UniqueName: \"kubernetes.io/projected/7de9fe51-8926-4966-85f1-b14c16db8a74-kube-api-access-wjmvt\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.040224 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7de9fe51-8926-4966-85f1-b14c16db8a74-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:32 crc kubenswrapper[4626]: W0223 06:44:32.059181 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad17db5e_dbea_4e58_94b2_6a897f475993.slice/crio-0c981a300349b3220bfa6ae37ebcf75358b8885e637199ccb5a96d06ed4b0ff5 WatchSource:0}: Error finding container 0c981a300349b3220bfa6ae37ebcf75358b8885e637199ccb5a96d06ed4b0ff5: Status 404 returned error can't find the container with id 0c981a300349b3220bfa6ae37ebcf75358b8885e637199ccb5a96d06ed4b0ff5 Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.106742 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn5h9"] Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.421195 4626 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 23 06:44:32 crc kubenswrapper[4626]: W0223 06:44:32.457159 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3dc1a917_fdce_46c7_93c7_506c00dcdeac.slice/crio-8db5146e618a657565be3a83f0efba41c0c9f264b66a67b4aa703c1fc076be2f WatchSource:0}: Error finding container 8db5146e618a657565be3a83f0efba41c0c9f264b66a67b4aa703c1fc076be2f: Status 404 returned error can't find the container with id 8db5146e618a657565be3a83f0efba41c0c9f264b66a67b4aa703c1fc076be2f Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.703438 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:32 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:32 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:32 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.703753 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.945332 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3dc1a917-fdce-46c7-93c7-506c00dcdeac","Type":"ContainerStarted","Data":"8db5146e618a657565be3a83f0efba41c0c9f264b66a67b4aa703c1fc076be2f"} Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.952341 4626 generic.go:334] "Generic (PLEG): container finished" podID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerID="5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4" exitCode=0 
Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.952405 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rbtx" event={"ID":"ad17db5e-dbea-4e58-94b2-6a897f475993","Type":"ContainerDied","Data":"5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4"} Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.952428 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rbtx" event={"ID":"ad17db5e-dbea-4e58-94b2-6a897f475993","Type":"ContainerStarted","Data":"0c981a300349b3220bfa6ae37ebcf75358b8885e637199ccb5a96d06ed4b0ff5"} Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.961134 4626 generic.go:334] "Generic (PLEG): container finished" podID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerID="25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0" exitCode=0 Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.961203 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5h9" event={"ID":"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b","Type":"ContainerDied","Data":"25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0"} Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.961233 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5h9" event={"ID":"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b","Type":"ContainerStarted","Data":"fa3d45d61ff27e5650c5ea5a99e03b7102abd0a6cb1730a64bead03ce723ca6d"} Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.979100 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.979216 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj" event={"ID":"7de9fe51-8926-4966-85f1-b14c16db8a74","Type":"ContainerDied","Data":"46bde7187f57119acc1f5d9bc4a590721e3476dea682619655290a0baafaa3c9"} Feb 23 06:44:32 crc kubenswrapper[4626]: I0223 06:44:32.979238 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46bde7187f57119acc1f5d9bc4a590721e3476dea682619655290a0baafaa3c9" Feb 23 06:44:33 crc kubenswrapper[4626]: I0223 06:44:33.702258 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:33 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:33 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:33 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:33 crc kubenswrapper[4626]: I0223 06:44:33.702586 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:34 crc kubenswrapper[4626]: I0223 06:44:34.010985 4626 generic.go:334] "Generic (PLEG): container finished" podID="3dc1a917-fdce-46c7-93c7-506c00dcdeac" containerID="0bb2eaedbdc12a37d55ebb99c48b4e2a897490075bb80357042e346d0e64ae39" exitCode=0 Feb 23 06:44:34 crc kubenswrapper[4626]: I0223 06:44:34.019008 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"3dc1a917-fdce-46c7-93c7-506c00dcdeac","Type":"ContainerDied","Data":"0bb2eaedbdc12a37d55ebb99c48b4e2a897490075bb80357042e346d0e64ae39"} Feb 23 06:44:34 crc kubenswrapper[4626]: I0223 06:44:34.702596 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:34 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:34 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:34 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:34 crc kubenswrapper[4626]: I0223 06:44:34.702663 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.327683 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.411156 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kube-api-access\") pod \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.411207 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kubelet-dir\") pod \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\" (UID: \"3dc1a917-fdce-46c7-93c7-506c00dcdeac\") " Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.411453 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3dc1a917-fdce-46c7-93c7-506c00dcdeac" (UID: "3dc1a917-fdce-46c7-93c7-506c00dcdeac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.411703 4626 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.419551 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3dc1a917-fdce-46c7-93c7-506c00dcdeac" (UID: "3dc1a917-fdce-46c7-93c7-506c00dcdeac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.440911 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.460084 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mpcw2" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.515605 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dc1a917-fdce-46c7-93c7-506c00dcdeac-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.702026 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:35 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:35 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:35 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:35 crc kubenswrapper[4626]: I0223 06:44:35.702088 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:36 crc kubenswrapper[4626]: I0223 06:44:36.080563 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:44:36 crc kubenswrapper[4626]: I0223 06:44:36.080553 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3dc1a917-fdce-46c7-93c7-506c00dcdeac","Type":"ContainerDied","Data":"8db5146e618a657565be3a83f0efba41c0c9f264b66a67b4aa703c1fc076be2f"} Feb 23 06:44:36 crc kubenswrapper[4626]: I0223 06:44:36.080617 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8db5146e618a657565be3a83f0efba41c0c9f264b66a67b4aa703c1fc076be2f" Feb 23 06:44:36 crc kubenswrapper[4626]: I0223 06:44:36.146220 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pj6n8" Feb 23 06:44:36 crc kubenswrapper[4626]: I0223 06:44:36.701982 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:36 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:36 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:36 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:36 crc kubenswrapper[4626]: I0223 06:44:36.702689 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:37 crc kubenswrapper[4626]: I0223 06:44:37.701455 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:37 crc kubenswrapper[4626]: [-]has-synced failed: 
reason withheld Feb 23 06:44:37 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:37 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:37 crc kubenswrapper[4626]: I0223 06:44:37.701808 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:38 crc kubenswrapper[4626]: I0223 06:44:38.702115 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:38 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:38 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:38 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:38 crc kubenswrapper[4626]: I0223 06:44:38.702180 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:39 crc kubenswrapper[4626]: I0223 06:44:39.703603 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:39 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:39 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:39 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:39 crc kubenswrapper[4626]: I0223 06:44:39.703705 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" 
podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:40 crc kubenswrapper[4626]: I0223 06:44:40.321138 4626 patch_prober.go:28] interesting pod/console-f9d7485db-ktkzv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 23 06:44:40 crc kubenswrapper[4626]: I0223 06:44:40.321474 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ktkzv" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 23 06:44:40 crc kubenswrapper[4626]: I0223 06:44:40.704630 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:40 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:40 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:40 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:40 crc kubenswrapper[4626]: I0223 06:44:40.704706 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:41 crc kubenswrapper[4626]: I0223 06:44:41.042956 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-b4q5j" Feb 23 06:44:41 crc kubenswrapper[4626]: I0223 06:44:41.702405 4626 patch_prober.go:28] interesting pod/router-default-5444994796-4bfst container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:44:41 crc kubenswrapper[4626]: [-]has-synced failed: reason withheld Feb 23 06:44:41 crc kubenswrapper[4626]: [+]process-running ok Feb 23 06:44:41 crc kubenswrapper[4626]: healthz check failed Feb 23 06:44:41 crc kubenswrapper[4626]: I0223 06:44:41.702463 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4bfst" podUID="39684a26-7fad-4a8b-9621-99db77c9a01f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:44:42 crc kubenswrapper[4626]: I0223 06:44:42.703692 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:42 crc kubenswrapper[4626]: I0223 06:44:42.706538 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-4bfst" Feb 23 06:44:45 crc kubenswrapper[4626]: I0223 06:44:45.798134 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:45 crc kubenswrapper[4626]: I0223 06:44:45.803319 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b6af64-b3dc-44ae-96bd-90ab1b79dc08-metrics-certs\") pod \"network-metrics-daemon-ls5wf\" (UID: \"53b6af64-b3dc-44ae-96bd-90ab1b79dc08\") " pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:45 crc kubenswrapper[4626]: I0223 06:44:45.904427 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5wf" Feb 23 06:44:48 crc kubenswrapper[4626]: I0223 06:44:48.505896 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:44:50 crc kubenswrapper[4626]: I0223 06:44:50.325170 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:50 crc kubenswrapper[4626]: I0223 06:44:50.330444 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:44:55 crc kubenswrapper[4626]: I0223 06:44:55.686039 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:44:55 crc kubenswrapper[4626]: I0223 06:44:55.686921 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:44:57 crc kubenswrapper[4626]: E0223 06:44:57.118982 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 23 06:44:57 crc kubenswrapper[4626]: E0223 06:44:57.119487 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7hcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wb2fx_openshift-marketplace(b3028805-3229-4cc3-9e20-2bca252b2c19): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:44:57 crc kubenswrapper[4626]: E0223 06:44:57.120695 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wb2fx" 
podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.339145 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wb2fx" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.424037 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.424261 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65kcn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-swmdj_openshift-marketplace(a1530c30-549e-4d85-b7ee-086832420311): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.426189 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-swmdj" podUID="a1530c30-549e-4d85-b7ee-086832420311" Feb 23 06:44:59 crc 
kubenswrapper[4626]: E0223 06:44:59.437817 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.438444 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qlg6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-wvhss_openshift-marketplace(dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.439696 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wvhss" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.453762 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.453950 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vj87f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-t66n4_openshift-marketplace(1f7808e4-0cf2-47e0-aa06-62282f4111db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.455177 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-t66n4" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" Feb 23 06:44:59 crc 
kubenswrapper[4626]: E0223 06:44:59.476816 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.476924 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9bdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-bn5h9_openshift-marketplace(e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.478618 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bn5h9" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.512678 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.513191 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xsdbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p8ddh_openshift-marketplace(850794ce-0d65-45b4-81be-15dd522ab8fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:44:59 crc kubenswrapper[4626]: E0223 06:44:59.518049 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p8ddh" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" Feb 23 06:44:59 crc 
kubenswrapper[4626]: I0223 06:44:59.804710 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ls5wf"] Feb 23 06:44:59 crc kubenswrapper[4626]: W0223 06:44:59.811787 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b6af64_b3dc_44ae_96bd_90ab1b79dc08.slice/crio-f82feb457cf1233ccd8e6027b65cbfc078bfae8a81249de49560464f163db5a4 WatchSource:0}: Error finding container f82feb457cf1233ccd8e6027b65cbfc078bfae8a81249de49560464f163db5a4: Status 404 returned error can't find the container with id f82feb457cf1233ccd8e6027b65cbfc078bfae8a81249de49560464f163db5a4 Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.129335 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf"] Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.129793 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc1a917-fdce-46c7-93c7-506c00dcdeac" containerName="pruner" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.129807 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc1a917-fdce-46c7-93c7-506c00dcdeac" containerName="pruner" Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.129824 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbde41b-950b-49d8-97eb-470ca9ebfdb8" containerName="pruner" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.129830 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbde41b-950b-49d8-97eb-470ca9ebfdb8" containerName="pruner" Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.129839 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de9fe51-8926-4966-85f1-b14c16db8a74" containerName="collect-profiles" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.129845 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7de9fe51-8926-4966-85f1-b14c16db8a74" containerName="collect-profiles" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.129952 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de9fe51-8926-4966-85f1-b14c16db8a74" containerName="collect-profiles" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.129965 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc1a917-fdce-46c7-93c7-506c00dcdeac" containerName="pruner" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.129974 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbde41b-950b-49d8-97eb-470ca9ebfdb8" containerName="pruner" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.130286 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.132450 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.137093 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.140945 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf"] Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.203876 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2034397-268c-4190-8b7a-5c65db1eebb9-secret-volume\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.204040 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxb4\" (UniqueName: \"kubernetes.io/projected/f2034397-268c-4190-8b7a-5c65db1eebb9-kube-api-access-5wxb4\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.204167 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2034397-268c-4190-8b7a-5c65db1eebb9-config-volume\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.305360 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxb4\" (UniqueName: \"kubernetes.io/projected/f2034397-268c-4190-8b7a-5c65db1eebb9-kube-api-access-5wxb4\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.305465 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2034397-268c-4190-8b7a-5c65db1eebb9-config-volume\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.305553 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2034397-268c-4190-8b7a-5c65db1eebb9-secret-volume\") pod \"collect-profiles-29530485-lhsnf\" (UID: 
\"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.306721 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2034397-268c-4190-8b7a-5c65db1eebb9-config-volume\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.314760 4626 generic.go:334] "Generic (PLEG): container finished" podID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerID="d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0" exitCode=0 Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.314852 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ps6z" event={"ID":"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91","Type":"ContainerDied","Data":"d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0"} Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.318627 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2034397-268c-4190-8b7a-5c65db1eebb9-secret-volume\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.319900 4626 generic.go:334] "Generic (PLEG): container finished" podID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerID="0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f" exitCode=0 Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.319965 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rbtx" 
event={"ID":"ad17db5e-dbea-4e58-94b2-6a897f475993","Type":"ContainerDied","Data":"0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f"} Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.328411 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wxb4\" (UniqueName: \"kubernetes.io/projected/f2034397-268c-4190-8b7a-5c65db1eebb9-kube-api-access-5wxb4\") pod \"collect-profiles-29530485-lhsnf\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.328937 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" event={"ID":"53b6af64-b3dc-44ae-96bd-90ab1b79dc08","Type":"ContainerStarted","Data":"849643ec17489fb10339bdb3cbe62460be35465adc4740bcc0cd9beb0560d60d"} Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.328982 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" event={"ID":"53b6af64-b3dc-44ae-96bd-90ab1b79dc08","Type":"ContainerStarted","Data":"09f16cfe953fa09e409fd906f08cfd71823ddcb876899ec314da117934e0f617"} Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.329002 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ls5wf" event={"ID":"53b6af64-b3dc-44ae-96bd-90ab1b79dc08","Type":"ContainerStarted","Data":"f82feb457cf1233ccd8e6027b65cbfc078bfae8a81249de49560464f163db5a4"} Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.331306 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-t66n4" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.331399 
4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-swmdj" podUID="a1530c30-549e-4d85-b7ee-086832420311" Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.332268 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p8ddh" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.334038 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bn5h9" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" Feb 23 06:45:00 crc kubenswrapper[4626]: E0223 06:45:00.334160 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wvhss" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.433905 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ls5wf" podStartSLOduration=185.433827984 podStartE2EDuration="3m5.433827984s" podCreationTimestamp="2026-02-23 06:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:45:00.421140893 +0000 UTC m=+252.760470148" 
watchObservedRunningTime="2026-02-23 06:45:00.433827984 +0000 UTC m=+252.773157240" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.451008 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" Feb 23 06:45:00 crc kubenswrapper[4626]: I0223 06:45:00.851665 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf"] Feb 23 06:45:00 crc kubenswrapper[4626]: W0223 06:45:00.862292 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2034397_268c_4190_8b7a_5c65db1eebb9.slice/crio-dbfa089f4eb3df4cf295fc69439880a3659e65032713fad4384891f5d7f5aa7d WatchSource:0}: Error finding container dbfa089f4eb3df4cf295fc69439880a3659e65032713fad4384891f5d7f5aa7d: Status 404 returned error can't find the container with id dbfa089f4eb3df4cf295fc69439880a3659e65032713fad4384891f5d7f5aa7d Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 06:45:01.339432 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ps6z" event={"ID":"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91","Type":"ContainerStarted","Data":"77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118"} Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 06:45:01.344786 4626 generic.go:334] "Generic (PLEG): container finished" podID="f2034397-268c-4190-8b7a-5c65db1eebb9" containerID="5c567ddfbada4f38d8ed50afba763158b44c39b04642fcffd752121387fb4c06" exitCode=0 Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 06:45:01.344905 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" event={"ID":"f2034397-268c-4190-8b7a-5c65db1eebb9","Type":"ContainerDied","Data":"5c567ddfbada4f38d8ed50afba763158b44c39b04642fcffd752121387fb4c06"} Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 
06:45:01.344949 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" event={"ID":"f2034397-268c-4190-8b7a-5c65db1eebb9","Type":"ContainerStarted","Data":"dbfa089f4eb3df4cf295fc69439880a3659e65032713fad4384891f5d7f5aa7d"}
Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 06:45:01.347696 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rbtx" event={"ID":"ad17db5e-dbea-4e58-94b2-6a897f475993","Type":"ContainerStarted","Data":"03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b"}
Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 06:45:01.367810 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9ps6z" podStartSLOduration=3.550848168 podStartE2EDuration="32.367798691s" podCreationTimestamp="2026-02-23 06:44:29 +0000 UTC" firstStartedPulling="2026-02-23 06:44:31.970959972 +0000 UTC m=+224.310289238" lastFinishedPulling="2026-02-23 06:45:00.787910494 +0000 UTC m=+253.127239761" observedRunningTime="2026-02-23 06:45:01.365008968 +0000 UTC m=+253.704338234" watchObservedRunningTime="2026-02-23 06:45:01.367798691 +0000 UTC m=+253.707127957"
Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 06:45:01.381188 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6rbtx" podStartSLOduration=3.522190551 podStartE2EDuration="31.381177084s" podCreationTimestamp="2026-02-23 06:44:30 +0000 UTC" firstStartedPulling="2026-02-23 06:44:32.954226621 +0000 UTC m=+225.293555887" lastFinishedPulling="2026-02-23 06:45:00.813213163 +0000 UTC m=+253.152542420" observedRunningTime="2026-02-23 06:45:01.37780769 +0000 UTC m=+253.717136956" watchObservedRunningTime="2026-02-23 06:45:01.381177084 +0000 UTC m=+253.720506350"
Feb 23 06:45:01 crc kubenswrapper[4626]: I0223 06:45:01.388513 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6rz8s"
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.588243 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9697bfb97-sfbbg"]
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.589063 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" podUID="870cbdb2-daed-4802-a6fb-77d817f68d07" containerName="controller-manager" containerID="cri-o://40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011" gracePeriod=30
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.685296 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"]
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.685564 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" podUID="f329f420-1c05-4b54-8de7-7f03d9cd78e4" containerName="route-controller-manager" containerID="cri-o://f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4" gracePeriod=30
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.827179 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf"
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.956566 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2034397-268c-4190-8b7a-5c65db1eebb9-secret-volume\") pod \"f2034397-268c-4190-8b7a-5c65db1eebb9\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") "
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.956719 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wxb4\" (UniqueName: \"kubernetes.io/projected/f2034397-268c-4190-8b7a-5c65db1eebb9-kube-api-access-5wxb4\") pod \"f2034397-268c-4190-8b7a-5c65db1eebb9\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") "
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.956755 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2034397-268c-4190-8b7a-5c65db1eebb9-config-volume\") pod \"f2034397-268c-4190-8b7a-5c65db1eebb9\" (UID: \"f2034397-268c-4190-8b7a-5c65db1eebb9\") "
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.957478 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2034397-268c-4190-8b7a-5c65db1eebb9-config-volume" (OuterVolumeSpecName: "config-volume") pod "f2034397-268c-4190-8b7a-5c65db1eebb9" (UID: "f2034397-268c-4190-8b7a-5c65db1eebb9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.980114 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2034397-268c-4190-8b7a-5c65db1eebb9-kube-api-access-5wxb4" (OuterVolumeSpecName: "kube-api-access-5wxb4") pod "f2034397-268c-4190-8b7a-5c65db1eebb9" (UID: "f2034397-268c-4190-8b7a-5c65db1eebb9"). InnerVolumeSpecName "kube-api-access-5wxb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:45:02 crc kubenswrapper[4626]: I0223 06:45:02.980654 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2034397-268c-4190-8b7a-5c65db1eebb9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f2034397-268c-4190-8b7a-5c65db1eebb9" (UID: "f2034397-268c-4190-8b7a-5c65db1eebb9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.057999 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wxb4\" (UniqueName: \"kubernetes.io/projected/f2034397-268c-4190-8b7a-5c65db1eebb9-kube-api-access-5wxb4\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.058027 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f2034397-268c-4190-8b7a-5c65db1eebb9-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.058041 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f2034397-268c-4190-8b7a-5c65db1eebb9-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.067104 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.123297 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.131315 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.159384 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-config\") pod \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.159454 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvtxf\" (UniqueName: \"kubernetes.io/projected/f329f420-1c05-4b54-8de7-7f03d9cd78e4-kube-api-access-wvtxf\") pod \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.159551 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-client-ca\") pod \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.159603 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f329f420-1c05-4b54-8de7-7f03d9cd78e4-serving-cert\") pod \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\" (UID: \"f329f420-1c05-4b54-8de7-7f03d9cd78e4\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.160400 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-client-ca" (OuterVolumeSpecName: "client-ca") pod "f329f420-1c05-4b54-8de7-7f03d9cd78e4" (UID: "f329f420-1c05-4b54-8de7-7f03d9cd78e4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.160428 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-config" (OuterVolumeSpecName: "config") pod "f329f420-1c05-4b54-8de7-7f03d9cd78e4" (UID: "f329f420-1c05-4b54-8de7-7f03d9cd78e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.164609 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f329f420-1c05-4b54-8de7-7f03d9cd78e4-kube-api-access-wvtxf" (OuterVolumeSpecName: "kube-api-access-wvtxf") pod "f329f420-1c05-4b54-8de7-7f03d9cd78e4" (UID: "f329f420-1c05-4b54-8de7-7f03d9cd78e4"). InnerVolumeSpecName "kube-api-access-wvtxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.164617 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f329f420-1c05-4b54-8de7-7f03d9cd78e4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f329f420-1c05-4b54-8de7-7f03d9cd78e4" (UID: "f329f420-1c05-4b54-8de7-7f03d9cd78e4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.260418 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-proxy-ca-bundles\") pod \"870cbdb2-daed-4802-a6fb-77d817f68d07\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.260472 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-client-ca\") pod \"870cbdb2-daed-4802-a6fb-77d817f68d07\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.260513 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq5lq\" (UniqueName: \"kubernetes.io/projected/870cbdb2-daed-4802-a6fb-77d817f68d07-kube-api-access-jq5lq\") pod \"870cbdb2-daed-4802-a6fb-77d817f68d07\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.260539 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870cbdb2-daed-4802-a6fb-77d817f68d07-serving-cert\") pod \"870cbdb2-daed-4802-a6fb-77d817f68d07\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.260674 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-config\") pod \"870cbdb2-daed-4802-a6fb-77d817f68d07\" (UID: \"870cbdb2-daed-4802-a6fb-77d817f68d07\") "
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.261004 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-config\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.261023 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvtxf\" (UniqueName: \"kubernetes.io/projected/f329f420-1c05-4b54-8de7-7f03d9cd78e4-kube-api-access-wvtxf\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.261034 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f329f420-1c05-4b54-8de7-7f03d9cd78e4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.261044 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f329f420-1c05-4b54-8de7-7f03d9cd78e4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.261467 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-client-ca" (OuterVolumeSpecName: "client-ca") pod "870cbdb2-daed-4802-a6fb-77d817f68d07" (UID: "870cbdb2-daed-4802-a6fb-77d817f68d07"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.261909 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "870cbdb2-daed-4802-a6fb-77d817f68d07" (UID: "870cbdb2-daed-4802-a6fb-77d817f68d07"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.262240 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-config" (OuterVolumeSpecName: "config") pod "870cbdb2-daed-4802-a6fb-77d817f68d07" (UID: "870cbdb2-daed-4802-a6fb-77d817f68d07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.264550 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870cbdb2-daed-4802-a6fb-77d817f68d07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "870cbdb2-daed-4802-a6fb-77d817f68d07" (UID: "870cbdb2-daed-4802-a6fb-77d817f68d07"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.264761 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870cbdb2-daed-4802-a6fb-77d817f68d07-kube-api-access-jq5lq" (OuterVolumeSpecName: "kube-api-access-jq5lq") pod "870cbdb2-daed-4802-a6fb-77d817f68d07" (UID: "870cbdb2-daed-4802-a6fb-77d817f68d07"). InnerVolumeSpecName "kube-api-access-jq5lq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.362411 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-config\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.362454 4626 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.362467 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870cbdb2-daed-4802-a6fb-77d817f68d07-client-ca\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.362480 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq5lq\" (UniqueName: \"kubernetes.io/projected/870cbdb2-daed-4802-a6fb-77d817f68d07-kube-api-access-jq5lq\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.362491 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870cbdb2-daed-4802-a6fb-77d817f68d07-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.364441 4626 generic.go:334] "Generic (PLEG): container finished" podID="870cbdb2-daed-4802-a6fb-77d817f68d07" containerID="40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011" exitCode=0
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.364558 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.364548 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" event={"ID":"870cbdb2-daed-4802-a6fb-77d817f68d07","Type":"ContainerDied","Data":"40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011"}
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.364788 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9697bfb97-sfbbg" event={"ID":"870cbdb2-daed-4802-a6fb-77d817f68d07","Type":"ContainerDied","Data":"36b94649b8cc3a33f308f3a1b072908f2331fea283a3d04c46759640380a7c43"}
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.364899 4626 scope.go:117] "RemoveContainer" containerID="40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.366454 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf" event={"ID":"f2034397-268c-4190-8b7a-5c65db1eebb9","Type":"ContainerDied","Data":"dbfa089f4eb3df4cf295fc69439880a3659e65032713fad4384891f5d7f5aa7d"}
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.366605 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbfa089f4eb3df4cf295fc69439880a3659e65032713fad4384891f5d7f5aa7d"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.366518 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.371755 4626 generic.go:334] "Generic (PLEG): container finished" podID="f329f420-1c05-4b54-8de7-7f03d9cd78e4" containerID="f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4" exitCode=0
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.371802 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" event={"ID":"f329f420-1c05-4b54-8de7-7f03d9cd78e4","Type":"ContainerDied","Data":"f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4"}
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.371833 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg" event={"ID":"f329f420-1c05-4b54-8de7-7f03d9cd78e4","Type":"ContainerDied","Data":"a79048a229894f6bac8e16afb9662e5bf47c55892401cf01bc88eb8154e0cf57"}
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.371884 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.391261 4626 scope.go:117] "RemoveContainer" containerID="40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011"
Feb 23 06:45:03 crc kubenswrapper[4626]: E0223 06:45:03.392747 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011\": container with ID starting with 40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011 not found: ID does not exist" containerID="40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.392789 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011"} err="failed to get container status \"40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011\": rpc error: code = NotFound desc = could not find container \"40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011\": container with ID starting with 40274e293423d0f2a7bec4b28211c24b1f38a9efbec2d8c67f57a35501d06011 not found: ID does not exist"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.392814 4626 scope.go:117] "RemoveContainer" containerID="f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.396422 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9697bfb97-sfbbg"]
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.414079 4626 scope.go:117] "RemoveContainer" containerID="f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4"
Feb 23 06:45:03 crc kubenswrapper[4626]: E0223 06:45:03.414485 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4\": container with ID starting with f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4 not found: ID does not exist" containerID="f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.414538 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4"} err="failed to get container status \"f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4\": rpc error: code = NotFound desc = could not find container \"f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4\": container with ID starting with f9f7484255359ce760967f142ae548ff8c1256e14ca5ae6330706828f0d884f4 not found: ID does not exist"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.414765 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9697bfb97-sfbbg"]
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.421660 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"]
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.421708 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8f7958cb-fmdvg"]
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.990170 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870cbdb2-daed-4802-a6fb-77d817f68d07" path="/var/lib/kubelet/pods/870cbdb2-daed-4802-a6fb-77d817f68d07/volumes"
Feb 23 06:45:03 crc kubenswrapper[4626]: I0223 06:45:03.991311 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f329f420-1c05-4b54-8de7-7f03d9cd78e4" path="/var/lib/kubelet/pods/f329f420-1c05-4b54-8de7-7f03d9cd78e4/volumes"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.443024 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"]
Feb 23 06:45:04 crc kubenswrapper[4626]: E0223 06:45:04.443388 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2034397-268c-4190-8b7a-5c65db1eebb9" containerName="collect-profiles"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.443411 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2034397-268c-4190-8b7a-5c65db1eebb9" containerName="collect-profiles"
Feb 23 06:45:04 crc kubenswrapper[4626]: E0223 06:45:04.443425 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870cbdb2-daed-4802-a6fb-77d817f68d07" containerName="controller-manager"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.443431 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="870cbdb2-daed-4802-a6fb-77d817f68d07" containerName="controller-manager"
Feb 23 06:45:04 crc kubenswrapper[4626]: E0223 06:45:04.443464 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f329f420-1c05-4b54-8de7-7f03d9cd78e4" containerName="route-controller-manager"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.443470 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f329f420-1c05-4b54-8de7-7f03d9cd78e4" containerName="route-controller-manager"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.443605 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2034397-268c-4190-8b7a-5c65db1eebb9" containerName="collect-profiles"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.443625 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f329f420-1c05-4b54-8de7-7f03d9cd78e4" containerName="route-controller-manager"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.443633 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="870cbdb2-daed-4802-a6fb-77d817f68d07" containerName="controller-manager"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.444164 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.446924 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"]
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.447673 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.447705 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.447901 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.448531 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.449812 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.449844 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.450277 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.452206 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.454130 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.454442 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.454751 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.455039 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"]
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.455147 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.455200 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.460598 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.465629 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"]
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580230 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-client-ca\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580295 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz6lj\" (UniqueName: \"kubernetes.io/projected/ae78e29d-6477-47e3-9ec1-1ebd651a5414-kube-api-access-lz6lj\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580396 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzcl\" (UniqueName: \"kubernetes.io/projected/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-kube-api-access-ktzcl\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580449 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-serving-cert\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580524 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-client-ca\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580587 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-config\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580614 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-proxy-ca-bundles\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580759 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-config\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.580815 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78e29d-6477-47e3-9ec1-1ebd651a5414-serving-cert\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682690 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-config\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682731 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78e29d-6477-47e3-9ec1-1ebd651a5414-serving-cert\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682764 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-client-ca\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682823 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz6lj\" (UniqueName: \"kubernetes.io/projected/ae78e29d-6477-47e3-9ec1-1ebd651a5414-kube-api-access-lz6lj\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682856 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzcl\" (UniqueName: \"kubernetes.io/projected/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-kube-api-access-ktzcl\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682889 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-serving-cert\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682911 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-client-ca\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-config\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.682974 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-proxy-ca-bundles\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.684060 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-config\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.684512 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-proxy-ca-bundles\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.685254 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-client-ca\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.685663 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-client-ca\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.686381 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-config\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.695248 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78e29d-6477-47e3-9ec1-1ebd651a5414-serving-cert\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.695335 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-serving-cert\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.702295 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzcl\" (UniqueName: \"kubernetes.io/projected/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-kube-api-access-ktzcl\") pod \"route-controller-manager-57d7cb74c5-tznlc\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"
Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.702873 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz6lj\" (UniqueName:
\"kubernetes.io/projected/ae78e29d-6477-47e3-9ec1-1ebd651a5414-kube-api-access-lz6lj\") pod \"controller-manager-575f6dcb4d-6s8wd\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.761093 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" Feb 23 06:45:04 crc kubenswrapper[4626]: I0223 06:45:04.768815 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.197937 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"] Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.232579 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"] Feb 23 06:45:05 crc kubenswrapper[4626]: W0223 06:45:05.243726 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae78e29d_6477_47e3_9ec1_1ebd651a5414.slice/crio-78aaa275537327bd3c2b884af720135bcbd9e71c6793113df312a1bf376b2b45 WatchSource:0}: Error finding container 78aaa275537327bd3c2b884af720135bcbd9e71c6793113df312a1bf376b2b45: Status 404 returned error can't find the container with id 78aaa275537327bd3c2b884af720135bcbd9e71c6793113df312a1bf376b2b45 Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.386640 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" event={"ID":"ae78e29d-6477-47e3-9ec1-1ebd651a5414","Type":"ContainerStarted","Data":"fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17"} Feb 23 06:45:05 crc kubenswrapper[4626]: 
I0223 06:45:05.386717 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.386731 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" event={"ID":"ae78e29d-6477-47e3-9ec1-1ebd651a5414","Type":"ContainerStarted","Data":"78aaa275537327bd3c2b884af720135bcbd9e71c6793113df312a1bf376b2b45"} Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.389156 4626 patch_prober.go:28] interesting pod/controller-manager-575f6dcb4d-6s8wd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.389191 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" event={"ID":"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b","Type":"ContainerStarted","Data":"02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f"} Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.389234 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" podUID="ae78e29d-6477-47e3-9ec1-1ebd651a5414" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.389252 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" event={"ID":"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b","Type":"ContainerStarted","Data":"6bad8fcd37e38e39c17059c9453757ee0d78062ae7cf1c3a369f91e0f6df7f9b"} Feb 23 06:45:05 crc 
kubenswrapper[4626]: I0223 06:45:05.389472 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.390943 4626 patch_prober.go:28] interesting pod/route-controller-manager-57d7cb74c5-tznlc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.391009 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" podUID="7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Feb 23 06:45:05 crc kubenswrapper[4626]: I0223 06:45:05.406755 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" podStartSLOduration=3.406739715 podStartE2EDuration="3.406739715s" podCreationTimestamp="2026-02-23 06:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:45:05.404103781 +0000 UTC m=+257.743433047" watchObservedRunningTime="2026-02-23 06:45:05.406739715 +0000 UTC m=+257.746068981" Feb 23 06:45:06 crc kubenswrapper[4626]: I0223 06:45:06.401615 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" Feb 23 06:45:06 crc kubenswrapper[4626]: I0223 06:45:06.401898 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" Feb 23 06:45:06 crc kubenswrapper[4626]: I0223 06:45:06.427951 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" podStartSLOduration=4.427927185 podStartE2EDuration="4.427927185s" podCreationTimestamp="2026-02-23 06:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:45:05.42810566 +0000 UTC m=+257.767434926" watchObservedRunningTime="2026-02-23 06:45:06.427927185 +0000 UTC m=+258.767256451" Feb 23 06:45:07 crc kubenswrapper[4626]: I0223 06:45:07.918955 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:45:07 crc kubenswrapper[4626]: I0223 06:45:07.921064 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:07 crc kubenswrapper[4626]: I0223 06:45:07.933899 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:45:07 crc kubenswrapper[4626]: I0223 06:45:07.934088 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:45:07 crc kubenswrapper[4626]: I0223 06:45:07.942190 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 06:45:08.030733 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" 
Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 06:45:08.030792 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 06:45:08.132349 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 06:45:08.132392 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 06:45:08.132682 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 06:45:08.151041 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 
06:45:08.236985 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:08 crc kubenswrapper[4626]: I0223 06:45:08.640160 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:45:08 crc kubenswrapper[4626]: W0223 06:45:08.653182 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0a718e5a_8c5e_4277_8b6e_0cdf66d3e1aa.slice/crio-5dfbd288e70b3a2f358b872187ab69f4e28650b53257b1320508632650eda645 WatchSource:0}: Error finding container 5dfbd288e70b3a2f358b872187ab69f4e28650b53257b1320508632650eda645: Status 404 returned error can't find the container with id 5dfbd288e70b3a2f358b872187ab69f4e28650b53257b1320508632650eda645 Feb 23 06:45:09 crc kubenswrapper[4626]: I0223 06:45:09.416473 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa","Type":"ContainerStarted","Data":"7ce6ae9a95cee4881151629fa042591cb2c5b58d9b1c429df300e499407e3dd0"} Feb 23 06:45:09 crc kubenswrapper[4626]: I0223 06:45:09.417582 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa","Type":"ContainerStarted","Data":"5dfbd288e70b3a2f358b872187ab69f4e28650b53257b1320508632650eda645"} Feb 23 06:45:09 crc kubenswrapper[4626]: I0223 06:45:09.442587 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.4425674490000002 podStartE2EDuration="2.442567449s" podCreationTimestamp="2026-02-23 06:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:45:09.440620735 +0000 UTC m=+261.779950001" 
watchObservedRunningTime="2026-02-23 06:45:09.442567449 +0000 UTC m=+261.781896715" Feb 23 06:45:10 crc kubenswrapper[4626]: I0223 06:45:10.211362 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9ps6z" Feb 23 06:45:10 crc kubenswrapper[4626]: I0223 06:45:10.211469 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9ps6z" Feb 23 06:45:10 crc kubenswrapper[4626]: I0223 06:45:10.375398 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9ps6z" Feb 23 06:45:10 crc kubenswrapper[4626]: I0223 06:45:10.421881 4626 generic.go:334] "Generic (PLEG): container finished" podID="0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa" containerID="7ce6ae9a95cee4881151629fa042591cb2c5b58d9b1c429df300e499407e3dd0" exitCode=0 Feb 23 06:45:10 crc kubenswrapper[4626]: I0223 06:45:10.422396 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa","Type":"ContainerDied","Data":"7ce6ae9a95cee4881151629fa042591cb2c5b58d9b1c429df300e499407e3dd0"} Feb 23 06:45:10 crc kubenswrapper[4626]: I0223 06:45:10.453419 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9ps6z" Feb 23 06:45:10 crc kubenswrapper[4626]: I0223 06:45:10.605202 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ps6z"] Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.021556 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6rbtx" Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.021889 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6rbtx" Feb 23 06:45:11 crc 
kubenswrapper[4626]: I0223 06:45:11.055954 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6rbtx" Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.459459 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6rbtx" Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.691045 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.882609 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kube-api-access\") pod \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.882721 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kubelet-dir\") pod \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\" (UID: \"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa\") " Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.882915 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa" (UID: "0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.887649 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa" (UID: "0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.984947 4626 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:11 crc kubenswrapper[4626]: I0223 06:45:11.985279 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:12 crc kubenswrapper[4626]: I0223 06:45:12.436713 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa","Type":"ContainerDied","Data":"5dfbd288e70b3a2f358b872187ab69f4e28650b53257b1320508632650eda645"} Feb 23 06:45:12 crc kubenswrapper[4626]: I0223 06:45:12.436771 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dfbd288e70b3a2f358b872187ab69f4e28650b53257b1320508632650eda645" Feb 23 06:45:12 crc kubenswrapper[4626]: I0223 06:45:12.437024 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:45:12 crc kubenswrapper[4626]: I0223 06:45:12.437895 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9ps6z" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="registry-server" containerID="cri-o://77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118" gracePeriod=2 Feb 23 06:45:12 crc kubenswrapper[4626]: I0223 06:45:12.835632 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ps6z" Feb 23 06:45:12 crc kubenswrapper[4626]: I0223 06:45:12.999686 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-catalog-content\") pod \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " Feb 23 06:45:12 crc kubenswrapper[4626]: I0223 06:45:12.999836 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-utilities\") pod \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:12.999978 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b8sm\" (UniqueName: \"kubernetes.io/projected/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-kube-api-access-7b8sm\") pod \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\" (UID: \"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91\") " Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.000560 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-utilities" (OuterVolumeSpecName: "utilities") pod 
"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" (UID: "b157932b-4d8e-46fd-8e45-f7e9a8f0cd91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.020739 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" (UID: "b157932b-4d8e-46fd-8e45-f7e9a8f0cd91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.025655 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-kube-api-access-7b8sm" (OuterVolumeSpecName: "kube-api-access-7b8sm") pod "b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" (UID: "b157932b-4d8e-46fd-8e45-f7e9a8f0cd91"). InnerVolumeSpecName "kube-api-access-7b8sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.102274 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.102321 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.102333 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b8sm\" (UniqueName: \"kubernetes.io/projected/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91-kube-api-access-7b8sm\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.445591 4626 generic.go:334] "Generic (PLEG): container finished" podID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerID="57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8" exitCode=0 Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.445656 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb2fx" event={"ID":"b3028805-3229-4cc3-9e20-2bca252b2c19","Type":"ContainerDied","Data":"57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8"} Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.449234 4626 generic.go:334] "Generic (PLEG): container finished" podID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerID="747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c" exitCode=0 Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.449296 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t66n4" 
event={"ID":"1f7808e4-0cf2-47e0-aa06-62282f4111db","Type":"ContainerDied","Data":"747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c"} Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.454143 4626 generic.go:334] "Generic (PLEG): container finished" podID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerID="77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118" exitCode=0 Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.454578 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ps6z" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.455053 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ps6z" event={"ID":"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91","Type":"ContainerDied","Data":"77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118"} Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.455091 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ps6z" event={"ID":"b157932b-4d8e-46fd-8e45-f7e9a8f0cd91","Type":"ContainerDied","Data":"c6a0a43be667556bbaac123f691b2b9806c3ac9fce9edf0dc05732dff5752eae"} Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.455112 4626 scope.go:117] "RemoveContainer" containerID="77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.473013 4626 scope.go:117] "RemoveContainer" containerID="d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.492925 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ps6z"] Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.498036 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ps6z"] Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 
06:45:13.500449 4626 scope.go:117] "RemoveContainer" containerID="9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.512205 4626 scope.go:117] "RemoveContainer" containerID="77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118" Feb 23 06:45:13 crc kubenswrapper[4626]: E0223 06:45:13.512490 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118\": container with ID starting with 77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118 not found: ID does not exist" containerID="77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.512626 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118"} err="failed to get container status \"77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118\": rpc error: code = NotFound desc = could not find container \"77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118\": container with ID starting with 77b1d373cd4aa52752bf13959013cedb8b87218da7723da0a4c414b35721f118 not found: ID does not exist" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.512649 4626 scope.go:117] "RemoveContainer" containerID="d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0" Feb 23 06:45:13 crc kubenswrapper[4626]: E0223 06:45:13.513356 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0\": container with ID starting with d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0 not found: ID does not exist" 
containerID="d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.513394 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0"} err="failed to get container status \"d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0\": rpc error: code = NotFound desc = could not find container \"d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0\": container with ID starting with d1820f11ec2b2d50399514c490e23247deda1ba7ac1c70b9abeae0b0f7d1a2e0 not found: ID does not exist" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.513416 4626 scope.go:117] "RemoveContainer" containerID="9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e" Feb 23 06:45:13 crc kubenswrapper[4626]: E0223 06:45:13.513765 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e\": container with ID starting with 9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e not found: ID does not exist" containerID="9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.513790 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e"} err="failed to get container status \"9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e\": rpc error: code = NotFound desc = could not find container \"9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e\": container with ID starting with 9767fb1505c908ee353829d1e5c79ba98feaf2170d4e5ed75d350d2107c6334e not found: ID does not exist" Feb 23 06:45:13 crc kubenswrapper[4626]: I0223 06:45:13.997092 4626 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" path="/var/lib/kubelet/pods/b157932b-4d8e-46fd-8e45-f7e9a8f0cd91/volumes" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.460474 4626 generic.go:334] "Generic (PLEG): container finished" podID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerID="80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893" exitCode=0 Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.460795 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvhss" event={"ID":"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5","Type":"ContainerDied","Data":"80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893"} Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.463571 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5h9" event={"ID":"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b","Type":"ContainerStarted","Data":"4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e"} Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.466182 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb2fx" event={"ID":"b3028805-3229-4cc3-9e20-2bca252b2c19","Type":"ContainerStarted","Data":"b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9"} Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.468174 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t66n4" event={"ID":"1f7808e4-0cf2-47e0-aa06-62282f4111db","Type":"ContainerStarted","Data":"407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354"} Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.512919 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wb2fx" podStartSLOduration=3.351736689 podStartE2EDuration="47.512901371s" 
podCreationTimestamp="2026-02-23 06:44:27 +0000 UTC" firstStartedPulling="2026-02-23 06:44:29.747208534 +0000 UTC m=+222.086537799" lastFinishedPulling="2026-02-23 06:45:13.908373215 +0000 UTC m=+266.247702481" observedRunningTime="2026-02-23 06:45:14.509690577 +0000 UTC m=+266.849019843" watchObservedRunningTime="2026-02-23 06:45:14.512901371 +0000 UTC m=+266.852230628" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.531524 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t66n4" podStartSLOduration=3.3435951360000002 podStartE2EDuration="47.531507423s" podCreationTimestamp="2026-02-23 06:44:27 +0000 UTC" firstStartedPulling="2026-02-23 06:44:29.748806572 +0000 UTC m=+222.088135838" lastFinishedPulling="2026-02-23 06:45:13.93671886 +0000 UTC m=+266.276048125" observedRunningTime="2026-02-23 06:45:14.527014115 +0000 UTC m=+266.866343381" watchObservedRunningTime="2026-02-23 06:45:14.531507423 +0000 UTC m=+266.870836689" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.918652 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:45:14 crc kubenswrapper[4626]: E0223 06:45:14.918887 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="extract-content" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.918905 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="extract-content" Feb 23 06:45:14 crc kubenswrapper[4626]: E0223 06:45:14.918916 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="registry-server" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.918921 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="registry-server" Feb 23 06:45:14 crc 
kubenswrapper[4626]: E0223 06:45:14.918934 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa" containerName="pruner" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.918942 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa" containerName="pruner" Feb 23 06:45:14 crc kubenswrapper[4626]: E0223 06:45:14.918948 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="extract-utilities" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.918954 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="extract-utilities" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.919075 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a718e5a-8c5e-4277-8b6e-0cdf66d3e1aa" containerName="pruner" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.919098 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b157932b-4d8e-46fd-8e45-f7e9a8f0cd91" containerName="registry-server" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.919465 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.922281 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.922611 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:45:14 crc kubenswrapper[4626]: I0223 06:45:14.951206 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.043215 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.043328 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e121b029-d37d-4211-84f6-348ffc0a1686-kube-api-access\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.043391 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-var-lock\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.144936 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.145035 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e121b029-d37d-4211-84f6-348ffc0a1686-kube-api-access\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.145071 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-var-lock\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.145082 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.145316 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-var-lock\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.173167 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e121b029-d37d-4211-84f6-348ffc0a1686-kube-api-access\") pod \"installer-9-crc\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.252315 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.480584 4626 generic.go:334] "Generic (PLEG): container finished" podID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerID="fe3c7a8d118d8f8e1b21878c0771b346817fb85b37b47f39c2e9c94204172ce5" exitCode=0 Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.480656 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8ddh" event={"ID":"850794ce-0d65-45b4-81be-15dd522ab8fc","Type":"ContainerDied","Data":"fe3c7a8d118d8f8e1b21878c0771b346817fb85b37b47f39c2e9c94204172ce5"} Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.484647 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvhss" event={"ID":"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5","Type":"ContainerStarted","Data":"42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372"} Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.490529 4626 generic.go:334] "Generic (PLEG): container finished" podID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerID="4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e" exitCode=0 Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.490593 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5h9" event={"ID":"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b","Type":"ContainerDied","Data":"4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e"} Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.550561 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wvhss" podStartSLOduration=2.364915996 podStartE2EDuration="46.550544236s" podCreationTimestamp="2026-02-23 
06:44:29 +0000 UTC" firstStartedPulling="2026-02-23 06:44:30.848891218 +0000 UTC m=+223.188220484" lastFinishedPulling="2026-02-23 06:45:15.034519468 +0000 UTC m=+267.373848724" observedRunningTime="2026-02-23 06:45:15.529029791 +0000 UTC m=+267.868359057" watchObservedRunningTime="2026-02-23 06:45:15.550544236 +0000 UTC m=+267.889873492" Feb 23 06:45:15 crc kubenswrapper[4626]: I0223 06:45:15.649418 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:45:15 crc kubenswrapper[4626]: W0223 06:45:15.658805 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode121b029_d37d_4211_84f6_348ffc0a1686.slice/crio-2b206815a72e4bb24a49e369c1a6558707c0be86c764e99a0e305c4ae5ac4191 WatchSource:0}: Error finding container 2b206815a72e4bb24a49e369c1a6558707c0be86c764e99a0e305c4ae5ac4191: Status 404 returned error can't find the container with id 2b206815a72e4bb24a49e369c1a6558707c0be86c764e99a0e305c4ae5ac4191 Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.500905 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8ddh" event={"ID":"850794ce-0d65-45b4-81be-15dd522ab8fc","Type":"ContainerStarted","Data":"65221602ee94cbd641035d68af794c5db14da3277e2fa4ebeacabc3e57fd3e33"} Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.503835 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e121b029-d37d-4211-84f6-348ffc0a1686","Type":"ContainerStarted","Data":"7b967edc24977c69759a771bfbf57c4f995faecb37b7a5e9b7329fd561ba38ed"} Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.503897 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e121b029-d37d-4211-84f6-348ffc0a1686","Type":"ContainerStarted","Data":"2b206815a72e4bb24a49e369c1a6558707c0be86c764e99a0e305c4ae5ac4191"} Feb 23 06:45:16 crc 
kubenswrapper[4626]: I0223 06:45:16.507675 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5h9" event={"ID":"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b","Type":"ContainerStarted","Data":"e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf"} Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.510097 4626 generic.go:334] "Generic (PLEG): container finished" podID="a1530c30-549e-4d85-b7ee-086832420311" containerID="40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86" exitCode=0 Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.510136 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swmdj" event={"ID":"a1530c30-549e-4d85-b7ee-086832420311","Type":"ContainerDied","Data":"40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86"} Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.557517 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p8ddh" podStartSLOduration=2.352385086 podStartE2EDuration="48.557473871s" podCreationTimestamp="2026-02-23 06:44:28 +0000 UTC" firstStartedPulling="2026-02-23 06:44:29.743332704 +0000 UTC m=+222.082661970" lastFinishedPulling="2026-02-23 06:45:15.948421489 +0000 UTC m=+268.287750755" observedRunningTime="2026-02-23 06:45:16.539703161 +0000 UTC m=+268.879032427" watchObservedRunningTime="2026-02-23 06:45:16.557473871 +0000 UTC m=+268.896803137" Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.558638 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bn5h9" podStartSLOduration=2.5570781240000002 podStartE2EDuration="45.558624486s" podCreationTimestamp="2026-02-23 06:44:31 +0000 UTC" firstStartedPulling="2026-02-23 06:44:32.968450636 +0000 UTC m=+225.307779903" lastFinishedPulling="2026-02-23 06:45:15.969997 +0000 UTC m=+268.309326265" 
observedRunningTime="2026-02-23 06:45:16.556163915 +0000 UTC m=+268.895493181" watchObservedRunningTime="2026-02-23 06:45:16.558624486 +0000 UTC m=+268.897953752" Feb 23 06:45:16 crc kubenswrapper[4626]: I0223 06:45:16.594356 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.59434437 podStartE2EDuration="2.59434437s" podCreationTimestamp="2026-02-23 06:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:45:16.590043295 +0000 UTC m=+268.929372560" watchObservedRunningTime="2026-02-23 06:45:16.59434437 +0000 UTC m=+268.933673637" Feb 23 06:45:17 crc kubenswrapper[4626]: I0223 06:45:17.519235 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swmdj" event={"ID":"a1530c30-549e-4d85-b7ee-086832420311","Type":"ContainerStarted","Data":"c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0"} Feb 23 06:45:17 crc kubenswrapper[4626]: I0223 06:45:17.542099 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-swmdj" podStartSLOduration=3.326340409 podStartE2EDuration="50.542081709s" podCreationTimestamp="2026-02-23 06:44:27 +0000 UTC" firstStartedPulling="2026-02-23 06:44:29.762630876 +0000 UTC m=+222.101960142" lastFinishedPulling="2026-02-23 06:45:16.978372175 +0000 UTC m=+269.317701442" observedRunningTime="2026-02-23 06:45:17.538654388 +0000 UTC m=+269.877983654" watchObservedRunningTime="2026-02-23 06:45:17.542081709 +0000 UTC m=+269.881410976" Feb 23 06:45:17 crc kubenswrapper[4626]: I0223 06:45:17.808888 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:45:17 crc kubenswrapper[4626]: I0223 06:45:17.808959 4626 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:45:17 crc kubenswrapper[4626]: I0223 06:45:17.851254 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.047963 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.048030 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.217853 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.217909 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.255740 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.423587 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.423650 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:45:18 crc kubenswrapper[4626]: I0223 06:45:18.456675 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:45:19 crc kubenswrapper[4626]: I0223 06:45:19.080540 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-swmdj" 
podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="registry-server" probeResult="failure" output=< Feb 23 06:45:19 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 06:45:19 crc kubenswrapper[4626]: > Feb 23 06:45:19 crc kubenswrapper[4626]: I0223 06:45:19.849188 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wvhss" Feb 23 06:45:19 crc kubenswrapper[4626]: I0223 06:45:19.849249 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wvhss" Feb 23 06:45:19 crc kubenswrapper[4626]: I0223 06:45:19.895315 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wvhss" Feb 23 06:45:20 crc kubenswrapper[4626]: I0223 06:45:20.584302 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wvhss" Feb 23 06:45:21 crc kubenswrapper[4626]: I0223 06:45:21.428798 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bn5h9" Feb 23 06:45:21 crc kubenswrapper[4626]: I0223 06:45:21.428855 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bn5h9" Feb 23 06:45:21 crc kubenswrapper[4626]: I0223 06:45:21.457566 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bn5h9" Feb 23 06:45:21 crc kubenswrapper[4626]: I0223 06:45:21.594890 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bn5h9" Feb 23 06:45:22 crc kubenswrapper[4626]: I0223 06:45:22.550712 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"] Feb 23 06:45:22 crc kubenswrapper[4626]: I0223 
06:45:22.551016 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" podUID="ae78e29d-6477-47e3-9ec1-1ebd651a5414" containerName="controller-manager" containerID="cri-o://fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17" gracePeriod=30 Feb 23 06:45:22 crc kubenswrapper[4626]: I0223 06:45:22.570477 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"] Feb 23 06:45:22 crc kubenswrapper[4626]: I0223 06:45:22.570689 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" podUID="7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" containerName="route-controller-manager" containerID="cri-o://02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f" gracePeriod=30 Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.047829 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.054366 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151021 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-client-ca\") pod \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151068 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktzcl\" (UniqueName: \"kubernetes.io/projected/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-kube-api-access-ktzcl\") pod \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151116 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-config\") pod \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151156 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-serving-cert\") pod \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151199 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-client-ca\") pod \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151247 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78e29d-6477-47e3-9ec1-1ebd651a5414-serving-cert\") pod \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151281 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz6lj\" (UniqueName: \"kubernetes.io/projected/ae78e29d-6477-47e3-9ec1-1ebd651a5414-kube-api-access-lz6lj\") pod \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151339 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-config\") pod \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\" (UID: \"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.151355 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-proxy-ca-bundles\") pod \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\" (UID: \"ae78e29d-6477-47e3-9ec1-1ebd651a5414\") " Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.152293 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-client-ca" (OuterVolumeSpecName: "client-ca") pod "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" (UID: "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.152563 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-config" (OuterVolumeSpecName: "config") pod "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" (UID: "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.152780 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-config" (OuterVolumeSpecName: "config") pod "ae78e29d-6477-47e3-9ec1-1ebd651a5414" (UID: "ae78e29d-6477-47e3-9ec1-1ebd651a5414"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.152946 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae78e29d-6477-47e3-9ec1-1ebd651a5414" (UID: "ae78e29d-6477-47e3-9ec1-1ebd651a5414"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.153052 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ae78e29d-6477-47e3-9ec1-1ebd651a5414" (UID: "ae78e29d-6477-47e3-9ec1-1ebd651a5414"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.153192 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.156431 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae78e29d-6477-47e3-9ec1-1ebd651a5414-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae78e29d-6477-47e3-9ec1-1ebd651a5414" (UID: "ae78e29d-6477-47e3-9ec1-1ebd651a5414"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.156445 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae78e29d-6477-47e3-9ec1-1ebd651a5414-kube-api-access-lz6lj" (OuterVolumeSpecName: "kube-api-access-lz6lj") pod "ae78e29d-6477-47e3-9ec1-1ebd651a5414" (UID: "ae78e29d-6477-47e3-9ec1-1ebd651a5414"). InnerVolumeSpecName "kube-api-access-lz6lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.156428 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-kube-api-access-ktzcl" (OuterVolumeSpecName: "kube-api-access-ktzcl") pod "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" (UID: "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b"). InnerVolumeSpecName "kube-api-access-ktzcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.157136 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" (UID: "7bcab753-1eb4-476b-bfd2-6d5f7da07f8b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254155 4626 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254188 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae78e29d-6477-47e3-9ec1-1ebd651a5414-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254200 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz6lj\" (UniqueName: \"kubernetes.io/projected/ae78e29d-6477-47e3-9ec1-1ebd651a5414-kube-api-access-lz6lj\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254218 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254231 4626 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254242 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktzcl\" (UniqueName: 
\"kubernetes.io/projected/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-kube-api-access-ktzcl\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254252 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae78e29d-6477-47e3-9ec1-1ebd651a5414-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.254260 4626 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.572570 4626 generic.go:334] "Generic (PLEG): container finished" podID="ae78e29d-6477-47e3-9ec1-1ebd651a5414" containerID="fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17" exitCode=0 Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.572623 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" event={"ID":"ae78e29d-6477-47e3-9ec1-1ebd651a5414","Type":"ContainerDied","Data":"fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17"} Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.572681 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" event={"ID":"ae78e29d-6477-47e3-9ec1-1ebd651a5414","Type":"ContainerDied","Data":"78aaa275537327bd3c2b884af720135bcbd9e71c6793113df312a1bf376b2b45"} Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.572703 4626 scope.go:117] "RemoveContainer" containerID="fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.572984 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.574403 4626 generic.go:334] "Generic (PLEG): container finished" podID="7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" containerID="02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f" exitCode=0 Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.574461 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" event={"ID":"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b","Type":"ContainerDied","Data":"02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f"} Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.574551 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" event={"ID":"7bcab753-1eb4-476b-bfd2-6d5f7da07f8b","Type":"ContainerDied","Data":"6bad8fcd37e38e39c17059c9453757ee0d78062ae7cf1c3a369f91e0f6df7f9b"} Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.574634 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.603974 4626 scope.go:117] "RemoveContainer" containerID="fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17" Feb 23 06:45:23 crc kubenswrapper[4626]: E0223 06:45:23.607604 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17\": container with ID starting with fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17 not found: ID does not exist" containerID="fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.607668 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17"} err="failed to get container status \"fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17\": rpc error: code = NotFound desc = could not find container \"fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17\": container with ID starting with fd4f3480d2a4b2638e3e086047304036130a13eb210725d6586ce9be9633aa17 not found: ID does not exist" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.607718 4626 scope.go:117] "RemoveContainer" containerID="02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.620179 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"] Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.629123 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d7cb74c5-tznlc"] Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.633589 4626 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"] Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.635658 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-575f6dcb4d-6s8wd"] Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.640363 4626 scope.go:117] "RemoveContainer" containerID="02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f" Feb 23 06:45:23 crc kubenswrapper[4626]: E0223 06:45:23.641031 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f\": container with ID starting with 02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f not found: ID does not exist" containerID="02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f" Feb 23 06:45:23 crc kubenswrapper[4626]: I0223 06:45:23.641077 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f"} err="failed to get container status \"02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f\": rpc error: code = NotFound desc = could not find container \"02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f\": container with ID starting with 02be59d41a600f481700c8b9838b1881fddf4643cdb16716fe9ebec880de728f not found: ID does not exist" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.003801 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" path="/var/lib/kubelet/pods/7bcab753-1eb4-476b-bfd2-6d5f7da07f8b/volumes" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.004366 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae78e29d-6477-47e3-9ec1-1ebd651a5414" 
path="/var/lib/kubelet/pods/ae78e29d-6477-47e3-9ec1-1ebd651a5414/volumes" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.456950 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fc97cdcc-q8jdx"] Feb 23 06:45:24 crc kubenswrapper[4626]: E0223 06:45:24.457416 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae78e29d-6477-47e3-9ec1-1ebd651a5414" containerName="controller-manager" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.457437 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae78e29d-6477-47e3-9ec1-1ebd651a5414" containerName="controller-manager" Feb 23 06:45:24 crc kubenswrapper[4626]: E0223 06:45:24.457449 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" containerName="route-controller-manager" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.457457 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" containerName="route-controller-manager" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.457651 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bcab753-1eb4-476b-bfd2-6d5f7da07f8b" containerName="route-controller-manager" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.457673 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae78e29d-6477-47e3-9ec1-1ebd651a5414" containerName="controller-manager" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.458461 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.462242 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.462400 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.462246 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.462706 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.463343 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.463669 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj"] Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.464766 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.464833 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.471208 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj"] Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.472442 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.472577 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.472695 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.472747 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.473027 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.473086 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.475471 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fc97cdcc-q8jdx"] Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.478630 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572157 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h26vp\" (UniqueName: \"kubernetes.io/projected/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-kube-api-access-h26vp\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572458 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42aeee91-98b8-4819-94fd-b1caa0d70713-client-ca\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572491 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-client-ca\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572530 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-proxy-ca-bundles\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572550 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42aeee91-98b8-4819-94fd-b1caa0d70713-serving-cert\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " 
pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572575 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78qcz\" (UniqueName: \"kubernetes.io/projected/42aeee91-98b8-4819-94fd-b1caa0d70713-kube-api-access-78qcz\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572594 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-config\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572615 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-serving-cert\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.572642 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42aeee91-98b8-4819-94fd-b1caa0d70713-config\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.675995 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h26vp\" (UniqueName: \"kubernetes.io/projected/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-kube-api-access-h26vp\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.676093 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42aeee91-98b8-4819-94fd-b1caa0d70713-client-ca\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.676136 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-client-ca\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.676156 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-proxy-ca-bundles\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.676182 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42aeee91-98b8-4819-94fd-b1caa0d70713-serving-cert\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 
crc kubenswrapper[4626]: I0223 06:45:24.676213 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78qcz\" (UniqueName: \"kubernetes.io/projected/42aeee91-98b8-4819-94fd-b1caa0d70713-kube-api-access-78qcz\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.676235 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-config\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.676259 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-serving-cert\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.676300 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42aeee91-98b8-4819-94fd-b1caa0d70713-config\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.677254 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42aeee91-98b8-4819-94fd-b1caa0d70713-client-ca\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: 
\"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.677520 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-client-ca\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.678634 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42aeee91-98b8-4819-94fd-b1caa0d70713-config\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.678718 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-config\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.678666 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-proxy-ca-bundles\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.684445 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42aeee91-98b8-4819-94fd-b1caa0d70713-serving-cert\") pod 
\"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.685152 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-serving-cert\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.694125 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78qcz\" (UniqueName: \"kubernetes.io/projected/42aeee91-98b8-4819-94fd-b1caa0d70713-kube-api-access-78qcz\") pod \"route-controller-manager-85f89695c9-csqmj\" (UID: \"42aeee91-98b8-4819-94fd-b1caa0d70713\") " pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.694334 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h26vp\" (UniqueName: \"kubernetes.io/projected/987eefe3-3106-43db-a0f0-6c3ed6fa8e9a-kube-api-access-h26vp\") pod \"controller-manager-fc97cdcc-q8jdx\" (UID: \"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a\") " pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.777567 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.787412 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.807934 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bn5h9"] Feb 23 06:45:24 crc kubenswrapper[4626]: I0223 06:45:24.808291 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bn5h9" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="registry-server" containerID="cri-o://e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf" gracePeriod=2 Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.106588 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn5h9" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.169657 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fc97cdcc-q8jdx"] Feb 23 06:45:25 crc kubenswrapper[4626]: W0223 06:45:25.173098 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987eefe3_3106_43db_a0f0_6c3ed6fa8e9a.slice/crio-57e1e2fdc5457deca0dd307667acadf4950ee3cbdb66134f59abe79ee3b67940 WatchSource:0}: Error finding container 57e1e2fdc5457deca0dd307667acadf4950ee3cbdb66134f59abe79ee3b67940: Status 404 returned error can't find the container with id 57e1e2fdc5457deca0dd307667acadf4950ee3cbdb66134f59abe79ee3b67940 Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.208021 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj"] Feb 23 06:45:25 crc kubenswrapper[4626]: W0223 06:45:25.212797 4626 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42aeee91_98b8_4819_94fd_b1caa0d70713.slice/crio-306ab04ccdd0ab8c8eb85fa22965d1a23ca7342b338a76d723f3275f291d4ab9 WatchSource:0}: Error finding container 306ab04ccdd0ab8c8eb85fa22965d1a23ca7342b338a76d723f3275f291d4ab9: Status 404 returned error can't find the container with id 306ab04ccdd0ab8c8eb85fa22965d1a23ca7342b338a76d723f3275f291d4ab9 Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.287211 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9bdc\" (UniqueName: \"kubernetes.io/projected/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-kube-api-access-c9bdc\") pod \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.287284 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-utilities\") pod \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.287322 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-catalog-content\") pod \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\" (UID: \"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b\") " Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.288990 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-utilities" (OuterVolumeSpecName: "utilities") pod "e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" (UID: "e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.291916 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-kube-api-access-c9bdc" (OuterVolumeSpecName: "kube-api-access-c9bdc") pod "e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" (UID: "e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b"). InnerVolumeSpecName "kube-api-access-c9bdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.388036 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" (UID: "e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.388925 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9bdc\" (UniqueName: \"kubernetes.io/projected/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-kube-api-access-c9bdc\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.388955 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.388994 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.593608 4626 generic.go:334] "Generic (PLEG): container finished" podID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" 
containerID="e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf" exitCode=0 Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.593753 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn5h9" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.594104 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5h9" event={"ID":"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b","Type":"ContainerDied","Data":"e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf"} Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.594152 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn5h9" event={"ID":"e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b","Type":"ContainerDied","Data":"fa3d45d61ff27e5650c5ea5a99e03b7102abd0a6cb1730a64bead03ce723ca6d"} Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.594173 4626 scope.go:117] "RemoveContainer" containerID="e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.596880 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" event={"ID":"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a","Type":"ContainerStarted","Data":"b3f77f53ae67e07ff6a3784949ecc02563f59cd14844eb2ad27230af69476a96"} Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.596923 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" event={"ID":"987eefe3-3106-43db-a0f0-6c3ed6fa8e9a","Type":"ContainerStarted","Data":"57e1e2fdc5457deca0dd307667acadf4950ee3cbdb66134f59abe79ee3b67940"} Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.597092 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:25 crc 
kubenswrapper[4626]: I0223 06:45:25.598333 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" event={"ID":"42aeee91-98b8-4819-94fd-b1caa0d70713","Type":"ContainerStarted","Data":"500186dd7f80e26642327c678b8256e5bf94fc031a4dc9a66af4f9ed16b6f19d"} Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.598366 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" event={"ID":"42aeee91-98b8-4819-94fd-b1caa0d70713","Type":"ContainerStarted","Data":"306ab04ccdd0ab8c8eb85fa22965d1a23ca7342b338a76d723f3275f291d4ab9"} Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.598755 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.600929 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.615183 4626 scope.go:117] "RemoveContainer" containerID="4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.619063 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bn5h9"] Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.633242 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bn5h9"] Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.644898 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" podStartSLOduration=3.644881383 podStartE2EDuration="3.644881383s" podCreationTimestamp="2026-02-23 06:45:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:45:25.637291847 +0000 UTC m=+277.976621113" watchObservedRunningTime="2026-02-23 06:45:25.644881383 +0000 UTC m=+277.984210649" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.646857 4626 scope.go:117] "RemoveContainer" containerID="25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.660617 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fc97cdcc-q8jdx" podStartSLOduration=3.6606052 podStartE2EDuration="3.6606052s" podCreationTimestamp="2026-02-23 06:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:45:25.657987411 +0000 UTC m=+277.997316678" watchObservedRunningTime="2026-02-23 06:45:25.6606052 +0000 UTC m=+277.999934466" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.662722 4626 scope.go:117] "RemoveContainer" containerID="e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf" Feb 23 06:45:25 crc kubenswrapper[4626]: E0223 06:45:25.663952 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf\": container with ID starting with e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf not found: ID does not exist" containerID="e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.664003 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf"} err="failed to get container status \"e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf\": rpc error: code = 
NotFound desc = could not find container \"e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf\": container with ID starting with e9d1ffad701c394fb2681c4497cf9dec3a89a4a1f28f7d2815f12dd5cdd5b8cf not found: ID does not exist" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.664030 4626 scope.go:117] "RemoveContainer" containerID="4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e" Feb 23 06:45:25 crc kubenswrapper[4626]: E0223 06:45:25.667579 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e\": container with ID starting with 4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e not found: ID does not exist" containerID="4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.667610 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e"} err="failed to get container status \"4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e\": rpc error: code = NotFound desc = could not find container \"4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e\": container with ID starting with 4efc777ee29d18266fc04cfcc7fb3ab101565b176d0830f0595acfac9e47620e not found: ID does not exist" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.667625 4626 scope.go:117] "RemoveContainer" containerID="25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0" Feb 23 06:45:25 crc kubenswrapper[4626]: E0223 06:45:25.667904 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0\": container with ID starting with 
25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0 not found: ID does not exist" containerID="25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.668026 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0"} err="failed to get container status \"25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0\": rpc error: code = NotFound desc = could not find container \"25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0\": container with ID starting with 25a38af6f2d8c3ceb846d6ee108fb7194661784e0eca140692de2deda4f200c0 not found: ID does not exist" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.685386 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.685535 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.685648 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.686224 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8"} 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.686337 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8" gracePeriod=600 Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.726707 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85f89695c9-csqmj" Feb 23 06:45:25 crc kubenswrapper[4626]: I0223 06:45:25.988937 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" path="/var/lib/kubelet/pods/e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b/volumes" Feb 23 06:45:26 crc kubenswrapper[4626]: I0223 06:45:26.608588 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8" exitCode=0 Feb 23 06:45:26 crc kubenswrapper[4626]: I0223 06:45:26.608675 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8"} Feb 23 06:45:26 crc kubenswrapper[4626]: I0223 06:45:26.608753 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"5b478f95f1f86a5654dee4643a7ca0e01deee382932ab3857cea5b660341ebe5"} Feb 23 06:45:27 crc kubenswrapper[4626]: I0223 
06:45:27.853184 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:45:28 crc kubenswrapper[4626]: I0223 06:45:28.087383 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:45:28 crc kubenswrapper[4626]: I0223 06:45:28.120028 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:45:28 crc kubenswrapper[4626]: I0223 06:45:28.254015 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:45:28 crc kubenswrapper[4626]: I0223 06:45:28.456275 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.207638 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t66n4"] Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.208353 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t66n4" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="registry-server" containerID="cri-o://407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354" gracePeriod=2 Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.408361 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p8ddh"] Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.408805 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p8ddh" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="registry-server" containerID="cri-o://65221602ee94cbd641035d68af794c5db14da3277e2fa4ebeacabc3e57fd3e33" gracePeriod=2 Feb 
23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.609883 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.639332 4626 generic.go:334] "Generic (PLEG): container finished" podID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerID="65221602ee94cbd641035d68af794c5db14da3277e2fa4ebeacabc3e57fd3e33" exitCode=0 Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.639407 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8ddh" event={"ID":"850794ce-0d65-45b4-81be-15dd522ab8fc","Type":"ContainerDied","Data":"65221602ee94cbd641035d68af794c5db14da3277e2fa4ebeacabc3e57fd3e33"} Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.641176 4626 generic.go:334] "Generic (PLEG): container finished" podID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerID="407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354" exitCode=0 Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.641207 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t66n4" event={"ID":"1f7808e4-0cf2-47e0-aa06-62282f4111db","Type":"ContainerDied","Data":"407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354"} Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.641231 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t66n4" event={"ID":"1f7808e4-0cf2-47e0-aa06-62282f4111db","Type":"ContainerDied","Data":"529243a204f9102a604fd11720b5e4964bce5b14360a2e69fd26ade01930c54a"} Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.641254 4626 scope.go:117] "RemoveContainer" containerID="407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.641384 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t66n4" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.678674 4626 scope.go:117] "RemoveContainer" containerID="747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.700028 4626 scope.go:117] "RemoveContainer" containerID="e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.716853 4626 scope.go:117] "RemoveContainer" containerID="407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354" Feb 23 06:45:30 crc kubenswrapper[4626]: E0223 06:45:30.717621 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354\": container with ID starting with 407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354 not found: ID does not exist" containerID="407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.717666 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354"} err="failed to get container status \"407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354\": rpc error: code = NotFound desc = could not find container \"407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354\": container with ID starting with 407a22e2c221ddaf5fdf405b6dfb563e61412cb787518288d0085575baf78354 not found: ID does not exist" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.717696 4626 scope.go:117] "RemoveContainer" containerID="747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c" Feb 23 06:45:30 crc kubenswrapper[4626]: E0223 06:45:30.718582 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c\": container with ID starting with 747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c not found: ID does not exist" containerID="747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.718609 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c"} err="failed to get container status \"747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c\": rpc error: code = NotFound desc = could not find container \"747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c\": container with ID starting with 747084baa497f738cd04fe13c6e2957177feab2695ed88399215e6298d03835c not found: ID does not exist" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.718626 4626 scope.go:117] "RemoveContainer" containerID="e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3" Feb 23 06:45:30 crc kubenswrapper[4626]: E0223 06:45:30.719103 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3\": container with ID starting with e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3 not found: ID does not exist" containerID="e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.719155 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3"} err="failed to get container status \"e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3\": rpc error: code = NotFound desc = could not find container 
\"e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3\": container with ID starting with e1b2ddc154c938443b11895bf1f4a52910c286c881abab0ab7b77387dd4148e3 not found: ID does not exist" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.758708 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.764425 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-utilities\") pod \"1f7808e4-0cf2-47e0-aa06-62282f4111db\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.764525 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-catalog-content\") pod \"1f7808e4-0cf2-47e0-aa06-62282f4111db\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.764701 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj87f\" (UniqueName: \"kubernetes.io/projected/1f7808e4-0cf2-47e0-aa06-62282f4111db-kube-api-access-vj87f\") pod \"1f7808e4-0cf2-47e0-aa06-62282f4111db\" (UID: \"1f7808e4-0cf2-47e0-aa06-62282f4111db\") " Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.765476 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-utilities" (OuterVolumeSpecName: "utilities") pod "1f7808e4-0cf2-47e0-aa06-62282f4111db" (UID: "1f7808e4-0cf2-47e0-aa06-62282f4111db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.765722 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-catalog-content\") pod \"850794ce-0d65-45b4-81be-15dd522ab8fc\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.766573 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.771187 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7808e4-0cf2-47e0-aa06-62282f4111db-kube-api-access-vj87f" (OuterVolumeSpecName: "kube-api-access-vj87f") pod "1f7808e4-0cf2-47e0-aa06-62282f4111db" (UID: "1f7808e4-0cf2-47e0-aa06-62282f4111db"). InnerVolumeSpecName "kube-api-access-vj87f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.822991 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "850794ce-0d65-45b4-81be-15dd522ab8fc" (UID: "850794ce-0d65-45b4-81be-15dd522ab8fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.826363 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f7808e4-0cf2-47e0-aa06-62282f4111db" (UID: "1f7808e4-0cf2-47e0-aa06-62282f4111db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.867688 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-utilities\") pod \"850794ce-0d65-45b4-81be-15dd522ab8fc\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.867845 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsdbd\" (UniqueName: \"kubernetes.io/projected/850794ce-0d65-45b4-81be-15dd522ab8fc-kube-api-access-xsdbd\") pod \"850794ce-0d65-45b4-81be-15dd522ab8fc\" (UID: \"850794ce-0d65-45b4-81be-15dd522ab8fc\") " Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.868140 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj87f\" (UniqueName: \"kubernetes.io/projected/1f7808e4-0cf2-47e0-aa06-62282f4111db-kube-api-access-vj87f\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.868158 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.868170 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f7808e4-0cf2-47e0-aa06-62282f4111db-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.868308 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-utilities" (OuterVolumeSpecName: "utilities") pod "850794ce-0d65-45b4-81be-15dd522ab8fc" (UID: "850794ce-0d65-45b4-81be-15dd522ab8fc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.871024 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850794ce-0d65-45b4-81be-15dd522ab8fc-kube-api-access-xsdbd" (OuterVolumeSpecName: "kube-api-access-xsdbd") pod "850794ce-0d65-45b4-81be-15dd522ab8fc" (UID: "850794ce-0d65-45b4-81be-15dd522ab8fc"). InnerVolumeSpecName "kube-api-access-xsdbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.969725 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850794ce-0d65-45b4-81be-15dd522ab8fc-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.969868 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsdbd\" (UniqueName: \"kubernetes.io/projected/850794ce-0d65-45b4-81be-15dd522ab8fc-kube-api-access-xsdbd\") on node \"crc\" DevicePath \"\"" Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.973987 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t66n4"] Feb 23 06:45:30 crc kubenswrapper[4626]: I0223 06:45:30.978630 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t66n4"] Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.651241 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p8ddh" event={"ID":"850794ce-0d65-45b4-81be-15dd522ab8fc","Type":"ContainerDied","Data":"b54c993cbab9f63e60d5b8097dab0ffaa664a6330a6c16c976ee143bdbc1c633"} Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.651332 4626 scope.go:117] "RemoveContainer" containerID="65221602ee94cbd641035d68af794c5db14da3277e2fa4ebeacabc3e57fd3e33" Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.651542 4626 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p8ddh" Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.667667 4626 scope.go:117] "RemoveContainer" containerID="fe3c7a8d118d8f8e1b21878c0771b346817fb85b37b47f39c2e9c94204172ce5" Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.688576 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p8ddh"] Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.694507 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p8ddh"] Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.700615 4626 scope.go:117] "RemoveContainer" containerID="0aa5ad3c06fe369a8cc17b6f385634c5c5e2bfb0ee5bd51d7f8798ba788880b9" Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.989390 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" path="/var/lib/kubelet/pods/1f7808e4-0cf2-47e0-aa06-62282f4111db/volumes" Feb 23 06:45:31 crc kubenswrapper[4626]: I0223 06:45:31.990318 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" path="/var/lib/kubelet/pods/850794ce-0d65-45b4-81be-15dd522ab8fc/volumes" Feb 23 06:45:39 crc kubenswrapper[4626]: I0223 06:45:39.513563 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8j9bj"] Feb 23 06:45:47 crc kubenswrapper[4626]: I0223 06:45:47.872154 4626 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.499933 4626 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500627 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500645 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500661 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="extract-content" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500669 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="extract-content" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500682 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="extract-utilities" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500688 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="extract-utilities" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500696 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500701 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500713 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="extract-content" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500719 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="extract-content" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500728 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="extract-utilities" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500733 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="extract-utilities" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500740 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="extract-utilities" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500746 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="extract-utilities" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500756 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="extract-content" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500762 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="extract-content" Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.500772 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500779 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500929 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fcae3c-81d2-43d3-bd1d-f4fd4ffd104b" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500946 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="850794ce-0d65-45b4-81be-15dd522ab8fc" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.500961 4626 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f7808e4-0cf2-47e0-aa06-62282f4111db" containerName="registry-server" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.501358 4626 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.501631 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502027 4626 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502119 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf" gracePeriod=15 Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502150 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1" gracePeriod=15 Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502144 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287" gracePeriod=15 Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502237 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" containerID="cri-o://a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1" gracePeriod=15
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502150 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78" gracePeriod=15
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502443 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502471 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502485 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502513 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502527 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502535 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502543 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502549 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502574 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502581 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502589 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502595 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502604 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502610 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.502620 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502627 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.502784 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503007 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503013 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503029 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503036 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503044 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503053 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.503195 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503204 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503315 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503336 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: E0223 06:45:53.503465 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.503473 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.508940 4626 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638125 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638178 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638206 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638252 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638293 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638317 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638335 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.638356 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740017 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740081 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740110 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740150 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740182 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740200 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740195 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740245 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740277 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740249 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740301 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740311 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740360 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740388 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740398 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.740455 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.788838 4626 generic.go:334] "Generic (PLEG): container finished" podID="e121b029-d37d-4211-84f6-348ffc0a1686" containerID="7b967edc24977c69759a771bfbf57c4f995faecb37b7a5e9b7329fd561ba38ed" exitCode=0
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.788947 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e121b029-d37d-4211-84f6-348ffc0a1686","Type":"ContainerDied","Data":"7b967edc24977c69759a771bfbf57c4f995faecb37b7a5e9b7329fd561ba38ed"}
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.789944 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.791490 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.792946 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.793716 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1" exitCode=0
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.793745 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78" exitCode=0
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.793755 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf" exitCode=0
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.793764 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1" exitCode=2
Feb 23 06:45:53 crc kubenswrapper[4626]: I0223 06:45:53.793822 4626 scope.go:117] "RemoveContainer" containerID="c979ccaa67e8232dc93e11ddfcaa4651d7932c9ac6ea752f2488216659a6509c"
Feb 23 06:45:54 crc kubenswrapper[4626]: I0223 06:45:54.803836 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.111858 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.112873 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.257177 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-kubelet-dir\") pod \"e121b029-d37d-4211-84f6-348ffc0a1686\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") "
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.257240 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-var-lock\") pod \"e121b029-d37d-4211-84f6-348ffc0a1686\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") "
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.257297 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e121b029-d37d-4211-84f6-348ffc0a1686" (UID: "e121b029-d37d-4211-84f6-348ffc0a1686"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.257306 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e121b029-d37d-4211-84f6-348ffc0a1686-kube-api-access\") pod \"e121b029-d37d-4211-84f6-348ffc0a1686\" (UID: \"e121b029-d37d-4211-84f6-348ffc0a1686\") "
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.257411 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-var-lock" (OuterVolumeSpecName: "var-lock") pod "e121b029-d37d-4211-84f6-348ffc0a1686" (UID: "e121b029-d37d-4211-84f6-348ffc0a1686"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.257650 4626 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.257670 4626 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e121b029-d37d-4211-84f6-348ffc0a1686-var-lock\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.264029 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e121b029-d37d-4211-84f6-348ffc0a1686-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e121b029-d37d-4211-84f6-348ffc0a1686" (UID: "e121b029-d37d-4211-84f6-348ffc0a1686"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.358560 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e121b029-d37d-4211-84f6-348ffc0a1686-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.814189 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e121b029-d37d-4211-84f6-348ffc0a1686","Type":"ContainerDied","Data":"2b206815a72e4bb24a49e369c1a6558707c0be86c764e99a0e305c4ae5ac4191"}
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.814674 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b206815a72e4bb24a49e369c1a6558707c0be86c764e99a0e305c4ae5ac4191"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.814470 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.850832 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.855150 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.856166 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.856766 4626 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.857141 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.966681 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.966823 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.966938 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.966854 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.966813 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.967006 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.967348 4626 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.967421 4626 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.967482 4626 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 23 06:45:55 crc kubenswrapper[4626]: I0223 06:45:55.987957 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.825935 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.826664 4626 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287" exitCode=0
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.826720 4626 scope.go:117] "RemoveContainer" containerID="a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.826857 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.827437 4626 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.827810 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.831483 4626 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.832088 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.846443 4626 scope.go:117] "RemoveContainer" containerID="b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.862040 4626 scope.go:117] "RemoveContainer" containerID="9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.874738 4626 scope.go:117] "RemoveContainer" containerID="4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.892281 4626 scope.go:117] "RemoveContainer" containerID="682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.911178 4626 scope.go:117] "RemoveContainer" containerID="6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.930212 4626 scope.go:117] "RemoveContainer" containerID="a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1"
Feb 23 06:45:56 crc kubenswrapper[4626]: E0223 06:45:56.930718 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\": container with ID starting with a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1 not found: ID does not exist" containerID="a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.930758 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1"} err="failed to get container status \"a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\": rpc error: code = NotFound desc = could not find container \"a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1\": container with ID starting with a482357128db4096e91724d8c53c4ba20d2d64fa76f4e66c3c4982fbfc74b1c1 not found: ID does not exist"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.930787 4626 scope.go:117] "RemoveContainer" containerID="b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78"
Feb 23 06:45:56 crc kubenswrapper[4626]: E0223 06:45:56.931220 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\": container with ID starting with b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78 not found: ID does not exist" containerID="b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.931244 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78"} err="failed to get container status \"b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\": rpc error: code = NotFound desc = could not find container \"b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78\": container with ID starting with b9017faa209ca6f695646d296d5f26ebc4b9157b09c7471419d41af716ff2d78 not found: ID does not exist"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.931263 4626 scope.go:117] "RemoveContainer" containerID="9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf"
Feb 23 06:45:56 crc kubenswrapper[4626]: E0223 06:45:56.931605 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\": container with ID starting with 9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf not found: ID does not exist" containerID="9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.931665 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf"} err="failed to get container status \"9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\": rpc error: code = NotFound desc = could not find container \"9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf\": container with ID starting with 9157f859f040c88f75fdf2cd92e5997b08420126216caac9d22e5ad9b6908faf not found: ID does not exist"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.931684 4626 scope.go:117] "RemoveContainer" containerID="4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1"
Feb 23 06:45:56 crc kubenswrapper[4626]: E0223 06:45:56.932213 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\": container with ID starting with 4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1 not found: ID does not exist" containerID="4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.932263 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1"} err="failed to get container status \"4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\": rpc error: code = NotFound desc = could not find container \"4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1\": container with ID starting with 4dc1563ce9e3335a34e05d7f9fe0740af966a8e5e361f1cb0483d72a924134a1 not found: ID does not exist"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.932294 4626 scope.go:117] "RemoveContainer" containerID="682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287"
Feb 23 06:45:56 crc kubenswrapper[4626]: E0223 06:45:56.933078 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\": container with ID starting with 682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287 not found: ID does not exist" containerID="682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.933108 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287"} err="failed to get container status \"682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\": rpc error: code = NotFound desc = could not find container \"682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287\": container with ID starting with 682b581e87d514099ad2c4f61770cfa80fbee55e47ea30178a3b6dfc64536287 not found: ID does not exist"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.933122 4626 scope.go:117] "RemoveContainer" containerID="6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145"
Feb 23 06:45:56 crc kubenswrapper[4626]: E0223 06:45:56.933746 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\": container with ID starting with 6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145 not found: ID does not exist" containerID="6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145"
Feb 23 06:45:56 crc kubenswrapper[4626]: I0223 06:45:56.933778 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145"} err="failed to get container status \"6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\": rpc error: code = NotFound desc = could not find container \"6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145\": container with ID starting with 6ef324c603b78e13f16911c8c26782994acbd8ee48bdbe931c1230e246c2d145 not found: ID does not exist"
Feb 23 06:45:57 crc kubenswrapper[4626]: I0223 06:45:57.984590 4626 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:57 crc kubenswrapper[4626]: I0223 06:45:57.984848 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused"
Feb 23 06:45:58 crc kubenswrapper[4626]: E0223 06:45:58.545079 4626 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:45:58 crc kubenswrapper[4626]: I0223 06:45:58.545646 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:45:58 crc kubenswrapper[4626]: E0223 06:45:58.566107 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd3b6efdc7ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:45:58.565758892 +0000 UTC m=+310.905088158,LastTimestamp:2026-02-23 06:45:58.565758892 +0000 UTC m=+310.905088158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:45:58 crc kubenswrapper[4626]: I0223 06:45:58.840779 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"55aecd2c0a1557547322d7a681f472a877514f2985c4d61c1ac156141ed34311"} Feb 23 06:45:58 crc kubenswrapper[4626]: I0223 06:45:58.841141 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7decb573ba8535ababb199ee96cf0120bc5be31f01ed582633dad6b6defb4b62"} Feb 23 06:45:58 crc 
kubenswrapper[4626]: E0223 06:45:58.841719 4626 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:45:58 crc kubenswrapper[4626]: I0223 06:45:58.841718 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:45:59 crc kubenswrapper[4626]: E0223 06:45:59.467001 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd3b6efdc7ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:45:58.565758892 +0000 UTC m=+310.905088158,LastTimestamp:2026-02-23 06:45:58.565758892 +0000 UTC m=+310.905088158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.185269 4626 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.185872 4626 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.186212 4626 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.186486 4626 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.186832 4626 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:00 crc kubenswrapper[4626]: I0223 06:46:00.186871 4626 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.187150 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="200ms" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.388286 
4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="400ms" Feb 23 06:46:00 crc kubenswrapper[4626]: E0223 06:46:00.789918 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="800ms" Feb 23 06:46:01 crc kubenswrapper[4626]: E0223 06:46:01.590999 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="1.6s" Feb 23 06:46:03 crc kubenswrapper[4626]: E0223 06:46:03.191727 4626 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="3.2s" Feb 23 06:46:04 crc kubenswrapper[4626]: I0223 06:46:04.539542 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" containerName="oauth-openshift" containerID="cri-o://cd4100468081553e98efd6b7d5b6d91f5dcb7145f256dcb4d18540b438dbc450" gracePeriod=15 Feb 23 06:46:04 crc kubenswrapper[4626]: I0223 06:46:04.881546 4626 generic.go:334] "Generic (PLEG): container finished" podID="919789ac-a13f-430c-a00c-5ab73f8e8cba" containerID="cd4100468081553e98efd6b7d5b6d91f5dcb7145f256dcb4d18540b438dbc450" exitCode=0 Feb 23 06:46:04 crc kubenswrapper[4626]: I0223 06:46:04.881604 4626 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" event={"ID":"919789ac-a13f-430c-a00c-5ab73f8e8cba","Type":"ContainerDied","Data":"cd4100468081553e98efd6b7d5b6d91f5dcb7145f256dcb4d18540b438dbc450"} Feb 23 06:46:04 crc kubenswrapper[4626]: I0223 06:46:04.915489 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:46:04 crc kubenswrapper[4626]: I0223 06:46:04.916116 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:04 crc kubenswrapper[4626]: I0223 06:46:04.916456 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073104 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-trusted-ca-bundle\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073178 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-idp-0-file-data\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: 
\"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073210 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-error\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073237 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-router-certs\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073257 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-login\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073844 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-policies\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073925 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-service-ca\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.073958 
4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-session\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.074021 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.074261 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.074393 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.074477 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-cliconfig\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075058 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.074601 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-provider-selection\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075153 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-ocp-branding-template\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075192 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-884bg\" (UniqueName: 
\"kubernetes.io/projected/919789ac-a13f-430c-a00c-5ab73f8e8cba-kube-api-access-884bg\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075224 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-dir\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075253 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-serving-cert\") pod \"919789ac-a13f-430c-a00c-5ab73f8e8cba\" (UID: \"919789ac-a13f-430c-a00c-5ab73f8e8cba\") " Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075765 4626 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075788 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075800 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.075813 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.080004 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.080552 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.081047 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919789ac-a13f-430c-a00c-5ab73f8e8cba-kube-api-access-884bg" (OuterVolumeSpecName: "kube-api-access-884bg") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "kube-api-access-884bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.081125 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.081277 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.081560 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.081867 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.086048 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.086814 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.087062 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "919789ac-a13f-430c-a00c-5ab73f8e8cba" (UID: "919789ac-a13f-430c-a00c-5ab73f8e8cba"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.176956 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.176987 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.176998 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.177008 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.177018 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.177030 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.177044 4626 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.177054 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-884bg\" (UniqueName: \"kubernetes.io/projected/919789ac-a13f-430c-a00c-5ab73f8e8cba-kube-api-access-884bg\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.177064 4626 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/919789ac-a13f-430c-a00c-5ab73f8e8cba-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.177073 4626 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/919789ac-a13f-430c-a00c-5ab73f8e8cba-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.890931 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" event={"ID":"919789ac-a13f-430c-a00c-5ab73f8e8cba","Type":"ContainerDied","Data":"d0761d5522b0826edfff4027d21666f1181d0d9a00055424be2bde7e73397c74"} Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.891013 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.891021 4626 scope.go:117] "RemoveContainer" containerID="cd4100468081553e98efd6b7d5b6d91f5dcb7145f256dcb4d18540b438dbc450" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.892361 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.892606 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.909845 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:05 crc kubenswrapper[4626]: I0223 06:46:05.910385 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:06 crc kubenswrapper[4626]: E0223 06:46:06.393011 4626 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.58:6443: connect: connection refused" interval="6.4s" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.805307 4626 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.805380 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.906623 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.907870 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.907927 4626 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6b79c01946e9f21e84f63573aec4dfe15408ce260d0c009b72abed6fa20c9335" exitCode=1 Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.907963 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6b79c01946e9f21e84f63573aec4dfe15408ce260d0c009b72abed6fa20c9335"} Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.908340 4626 scope.go:117] "RemoveContainer" containerID="6b79c01946e9f21e84f63573aec4dfe15408ce260d0c009b72abed6fa20c9335" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.908678 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.908978 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.909240 4626 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.984600 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.985148 4626 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[4626]: I0223 06:46:07.985357 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.915611 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.917761 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.917820 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a1f515354141293c5122e82d7a7534a2fbc70f692e9833359083e14f36c21ca9"} Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.918648 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: 
connect: connection refused" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.918983 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.919342 4626 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.982072 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.982667 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.982936 4626 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.983209 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" 
pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.993022 4626 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.993047 4626 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:08 crc kubenswrapper[4626]: E0223 06:46:08.993364 4626 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:08 crc kubenswrapper[4626]: I0223 06:46:08.993880 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:09 crc kubenswrapper[4626]: W0223 06:46:09.012214 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-a1131250c73a5c6028bd0722a377fd45795aa555742c23a1ca81375810d5f8ee WatchSource:0}: Error finding container a1131250c73a5c6028bd0722a377fd45795aa555742c23a1ca81375810d5f8ee: Status 404 returned error can't find the container with id a1131250c73a5c6028bd0722a377fd45795aa555742c23a1ca81375810d5f8ee Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.362670 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.366990 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.367583 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.367991 4626 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.368210 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" 
pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:09 crc kubenswrapper[4626]: E0223 06:46:09.468040 4626 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd3b6efdc7ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:45:58.565758892 +0000 UTC m=+310.905088158,LastTimestamp:2026-02-23 06:45:58.565758892 +0000 UTC m=+310.905088158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.925186 4626 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9e87a0d4ba2a444f133b2555cbe4e63bb1924d13d18d22e71e96bf277e6e24b5" exitCode=0 Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.925242 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9e87a0d4ba2a444f133b2555cbe4e63bb1924d13d18d22e71e96bf277e6e24b5"} Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.925319 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a1131250c73a5c6028bd0722a377fd45795aa555742c23a1ca81375810d5f8ee"} Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.925564 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.925764 4626 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.925779 4626 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.926118 4626 status_manager.go:851] "Failed to get status for pod" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:09 crc kubenswrapper[4626]: E0223 06:46:09.926243 4626 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.926443 4626 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:09 crc kubenswrapper[4626]: I0223 06:46:09.926715 4626 status_manager.go:851] "Failed to get status for pod" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" pod="openshift-authentication/oauth-openshift-558db77b4-8j9bj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8j9bj\": dial tcp 192.168.26.58:6443: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[4626]: E0223 06:46:10.025792 4626 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.26.58:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" volumeName="registry-storage" Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.946059 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cfc1f533ea71187b2eb6f9d36da31a27675951a434661e6ebd4c247968c44b9b"} Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.946652 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"97c7eff4f645be621975a1514547165cc06056544df0ec7b85ba13ffcab15a83"} Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.946666 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c51d4c7615c3c50b511aeb94bcd6402a81ec45981b5865586c3617fb89cd1334"} Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.946677 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8a5916b61d8c6a30c41e841d1646fd06a755a2a78d808128186b639fc8402851"} Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.946686 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c8db61e9db9c11ae07876f4c7f8a2b91f4fdfeefbfbcb73faae46c6199935445"} Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.946995 4626 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.947011 4626 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:10 crc kubenswrapper[4626]: I0223 06:46:10.947002 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:13 crc kubenswrapper[4626]: I0223 06:46:13.995014 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:13 crc kubenswrapper[4626]: I0223 06:46:13.995406 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:13 crc kubenswrapper[4626]: I0223 06:46:13.999450 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:16 
crc kubenswrapper[4626]: I0223 06:46:16.484964 4626 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:16 crc kubenswrapper[4626]: I0223 06:46:16.978260 4626 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:16 crc kubenswrapper[4626]: I0223 06:46:16.978294 4626 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:16 crc kubenswrapper[4626]: I0223 06:46:16.981418 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:17 crc kubenswrapper[4626]: I0223 06:46:17.989237 4626 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:17 crc kubenswrapper[4626]: I0223 06:46:17.989648 4626 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3" Feb 23 06:46:17 crc kubenswrapper[4626]: I0223 06:46:17.997686 4626 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fed4da1c-65c1-45e0-80be-b9ee459b7c34" Feb 23 06:46:26 crc kubenswrapper[4626]: I0223 06:46:26.204962 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 06:46:26 crc kubenswrapper[4626]: I0223 06:46:26.330755 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 23 06:46:26 crc kubenswrapper[4626]: I0223 06:46:26.657796 4626 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Feb 23 06:46:26 crc kubenswrapper[4626]: I0223 06:46:26.956932 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:46:27 crc kubenswrapper[4626]: I0223 06:46:27.326018 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 23 06:46:27 crc kubenswrapper[4626]: I0223 06:46:27.807715 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:27 crc kubenswrapper[4626]: I0223 06:46:27.962392 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.040620 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.090054 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.224760 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.286442 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.334521 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.560413 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 06:46:28 
crc kubenswrapper[4626]: I0223 06:46:28.663884 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.710890 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.714694 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.746346 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.809555 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.903598 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 06:46:28 crc kubenswrapper[4626]: I0223 06:46:28.954765 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.163558 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.295695 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.406995 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.527135 4626 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.528920 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.562543 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.605851 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.675985 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.711391 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.735648 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.785203 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.789440 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.851695 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.855036 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.872275 
4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 23 06:46:29 crc kubenswrapper[4626]: I0223 06:46:29.966817 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.026157 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.104587 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.136410 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.195901 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.313826 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.527325 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.659232 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.788007 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.802541 4626 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.825817 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.860705 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.889256 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.895908 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.896891 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 06:46:30 crc kubenswrapper[4626]: I0223 06:46:30.899585 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.064396 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.079445 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.126373 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.146235 4626 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.179284 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.301106 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.366235 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.387090 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.458695 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.573079 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.579661 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.618036 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.633447 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.644619 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.704667 4626 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.783968 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.811618 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.880422 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.915593 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.916340 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 23 06:46:31 crc kubenswrapper[4626]: I0223 06:46:31.942549 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.063509 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.105519 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.198284 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.242665 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.261702 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.293278 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.318551 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.344222 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.367411 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.409012 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.483383 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.526132 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.579280 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.754405 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.760308 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.813780 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.821428 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.947378 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[4626]: I0223 06:46:32.948323 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.013769 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.130279 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.160874 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.311440 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.382934 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.418273 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.485203 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.656810 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.752553 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.854102 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.854754 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.949727 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 23 06:46:33 crc kubenswrapper[4626]: I0223 06:46:33.988605 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.015338 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.078216 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.081341 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.086508 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.162842 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.172948 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.194800 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.315119 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.431086 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.473294 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.510886 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.586697 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.612921 4626 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.652979 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.715416 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.724949 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.738076 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.756792 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.804474 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.809048 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 23 06:46:34 crc kubenswrapper[4626]: I0223 06:46:34.857352 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.002189 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.039047 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.049326 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.061383 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.136206 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.189464 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.400728 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.431767 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.447767 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.618040 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.686694 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.892652 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.959072 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 23 06:46:35 crc kubenswrapper[4626]: I0223 06:46:35.999000 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.060225 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.093015 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.143069 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.166116 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.199231 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.250487 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.343299 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.529803 4626 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.542399 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.605162 4626 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.610383 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8j9bj","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.610449 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58cd8c9949-5prqt","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 06:46:36 crc kubenswrapper[4626]: E0223 06:46:36.610743 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" containerName="installer"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.610767 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" containerName="installer"
Feb 23 06:46:36 crc kubenswrapper[4626]: E0223 06:46:36.610783 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" containerName="oauth-openshift"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.610790 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" containerName="oauth-openshift"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.610840 4626 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.610884 4626 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5e2fb2c0-d7f4-4160-9cd6-c0d0468649e3"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.610986 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" containerName="oauth-openshift"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.611009 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="e121b029-d37d-4211-84f6-348ffc0a1686" containerName="installer"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.611727 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.613128 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.614541 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.615827 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.616076 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.616121 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.616460 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.616722 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.616881 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.616959 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.616989 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.618349 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.620393 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.620606 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.624769 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.627826 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.632650 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.632897 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-error\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.632943 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aff37477-ecbc-43b2-8b7e-523e29044fbf-audit-dir\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.632967 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-session\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.632995 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633023 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-audit-policies\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633054 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633073 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633114 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633157 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-service-ca\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633189 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633207 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-login\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633226 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633239 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633269 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-router-certs\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.633300 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt96n\" (UniqueName: \"kubernetes.io/projected/aff37477-ecbc-43b2-8b7e-523e29044fbf-kube-api-access-tt96n\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.654264 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.654249397 podStartE2EDuration="20.654249397s" podCreationTimestamp="2026-02-23 06:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:36.649457889 +0000 UTC m=+348.988787156" watchObservedRunningTime="2026-02-23 06:46:36.654249397 +0000 UTC m=+348.993578663"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.683715 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.733901 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.733935 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-login\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.733961 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.733981 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-router-certs\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734005 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt96n\" (UniqueName: \"kubernetes.io/projected/aff37477-ecbc-43b2-8b7e-523e29044fbf-kube-api-access-tt96n\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734023 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-error\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734086 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aff37477-ecbc-43b2-8b7e-523e29044fbf-audit-dir\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734105 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-session\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734122 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734138 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-audit-policies\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734157 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734172 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734198 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734221 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-service-ca\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.734870 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-service-ca\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.735147 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.735337 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-audit-policies\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.736325 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.736561 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aff37477-ecbc-43b2-8b7e-523e29044fbf-audit-dir\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.742371 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-login\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.742430 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.742654 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-router-certs\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.742676 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-error\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.743715 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.744067 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-session\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt"
Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.747837 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") "
pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.748131 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/aff37477-ecbc-43b2-8b7e-523e29044fbf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.750376 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt96n\" (UniqueName: \"kubernetes.io/projected/aff37477-ecbc-43b2-8b7e-523e29044fbf-kube-api-access-tt96n\") pod \"oauth-openshift-58cd8c9949-5prqt\" (UID: \"aff37477-ecbc-43b2-8b7e-523e29044fbf\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.766869 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.805032 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.819632 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.829041 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.862046 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.936393 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" Feb 23 06:46:36 crc kubenswrapper[4626]: I0223 06:46:36.948129 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.044443 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.163131 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.266695 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.296130 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.327567 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.446258 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.583048 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.643089 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.905994 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 
06:46:37.922446 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.944098 4626 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.944357 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://55aecd2c0a1557547322d7a681f472a877514f2985c4d61c1ac156141ed34311" gracePeriod=5 Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.950622 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.965402 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.973149 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.986630 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 23 06:46:37 crc kubenswrapper[4626]: I0223 06:46:37.988323 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919789ac-a13f-430c-a00c-5ab73f8e8cba" path="/var/lib/kubelet/pods/919789ac-a13f-430c-a00c-5ab73f8e8cba/volumes" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.067006 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.079838 4626 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.081543 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.166769 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.222802 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.224722 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.231035 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.236186 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.390853 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.432733 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.480427 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.621046 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.655252 4626 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.780381 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.784140 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.813713 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.839328 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.880595 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:46:38 crc kubenswrapper[4626]: I0223 06:46:38.998459 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.062692 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.161148 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.165886 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.167968 4626 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.209689 4626 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.375857 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.418273 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.464900 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.471834 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.659536 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.705185 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.749150 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 06:46:39 crc kubenswrapper[4626]: I0223 06:46:39.918123 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.022048 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.114351 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 06:46:40 crc 
kubenswrapper[4626]: I0223 06:46:40.359462 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.544637 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.639094 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.640103 4626 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.707486 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.720944 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.741122 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58cd8c9949-5prqt"] Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.744124 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 23 06:46:40 crc kubenswrapper[4626]: I0223 06:46:40.766950 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.147651 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58cd8c9949-5prqt"] Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.322207 4626 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.459016 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.468622 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.477985 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.610521 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.620944 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.828738 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 06:46:41 crc kubenswrapper[4626]: I0223 06:46:41.924460 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.080815 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.107988 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" event={"ID":"aff37477-ecbc-43b2-8b7e-523e29044fbf","Type":"ContainerStarted","Data":"6b43b0990e36349f5f97742c87f5e9b88808b0c982eac69357c3badd2cbc2b52"} Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.108058 4626 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" event={"ID":"aff37477-ecbc-43b2-8b7e-523e29044fbf","Type":"ContainerStarted","Data":"129127427d46726c759dd27c8ffb405d1c52e5f67a9bfacaeed503d1f45e364f"} Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.108526 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.113710 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.131798 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58cd8c9949-5prqt" podStartSLOduration=63.131774801 podStartE2EDuration="1m3.131774801s" podCreationTimestamp="2026-02-23 06:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:42.128137335 +0000 UTC m=+354.467466601" watchObservedRunningTime="2026-02-23 06:46:42.131774801 +0000 UTC m=+354.471104067" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.190452 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.232300 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.300464 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.523944 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 23 06:46:42 crc 
kubenswrapper[4626]: I0223 06:46:42.687381 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.845956 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.919988 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 06:46:42 crc kubenswrapper[4626]: I0223 06:46:42.951329 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.114643 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.114700 4626 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="55aecd2c0a1557547322d7a681f472a877514f2985c4d61c1ac156141ed34311" exitCode=137 Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.396527 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.516897 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.516994 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626225 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626290 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626336 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626357 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626439 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626736 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: 
"var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626727 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626722 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626821 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.626954 4626 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.627005 4626 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.627015 4626 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.635215 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.727557 4626 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.727918 4626 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:43 crc kubenswrapper[4626]: I0223 06:46:43.987976 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 23 06:46:44 crc kubenswrapper[4626]: I0223 06:46:44.120602 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 06:46:44 crc kubenswrapper[4626]: I0223 06:46:44.120746 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 06:46:44 crc kubenswrapper[4626]: I0223 06:46:44.120811 4626 scope.go:117] "RemoveContainer" containerID="55aecd2c0a1557547322d7a681f472a877514f2985c4d61c1ac156141ed34311"
Feb 23 06:46:57 crc kubenswrapper[4626]: I0223 06:46:57.199517 4626 generic.go:334] "Generic (PLEG): container finished" podID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerID="a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189" exitCode=0
Feb 23 06:46:57 crc kubenswrapper[4626]: I0223 06:46:57.199605 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" event={"ID":"c98d4d0e-9a53-4b64-a26d-11eda45f90fa","Type":"ContainerDied","Data":"a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189"}
Feb 23 06:46:57 crc kubenswrapper[4626]: I0223 06:46:57.200404 4626 scope.go:117] "RemoveContainer" containerID="a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189"
Feb 23 06:46:57 crc kubenswrapper[4626]: I0223 06:46:57.711656 4626 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 23 06:46:58 crc kubenswrapper[4626]: I0223 06:46:58.207610 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" event={"ID":"c98d4d0e-9a53-4b64-a26d-11eda45f90fa","Type":"ContainerStarted","Data":"e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929"}
Feb 23 06:46:58 crc kubenswrapper[4626]: I0223 06:46:58.208794 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:46:58 crc kubenswrapper[4626]: I0223 06:46:58.213266 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:47:02 crc kubenswrapper[4626]: I0223 06:47:02.961132 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:47:05 crc kubenswrapper[4626]: I0223 06:47:05.567915 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 23 06:47:05 crc kubenswrapper[4626]: I0223 06:47:05.697451 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 23 06:47:10 crc kubenswrapper[4626]: I0223 06:47:10.608919 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 23 06:47:13 crc kubenswrapper[4626]: I0223 06:47:13.439458 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 23 06:47:14 crc kubenswrapper[4626]: I0223 06:47:14.152022 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 23 06:47:25 crc kubenswrapper[4626]: I0223 06:47:25.685766 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 06:47:25 crc kubenswrapper[4626]: I0223 06:47:25.686418 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.685905 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.687869 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.710473 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dvrq9"]
Feb 23 06:47:55 crc kubenswrapper[4626]: E0223 06:47:55.710905 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.710933 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.711077 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.711795 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.725766 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dvrq9"]
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.846479 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-registry-tls\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.846943 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/870349d6-17d4-4148-ab88-6db887f850b3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.847086 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870349d6-17d4-4148-ab88-6db887f850b3-trusted-ca\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.847236 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/870349d6-17d4-4148-ab88-6db887f850b3-registry-certificates\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.847341 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.847476 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch2kp\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-kube-api-access-ch2kp\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.847609 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/870349d6-17d4-4148-ab88-6db887f850b3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.847968 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-bound-sa-token\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.870625 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.949084 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/870349d6-17d4-4148-ab88-6db887f850b3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.949130 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-registry-tls\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.949154 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870349d6-17d4-4148-ab88-6db887f850b3-trusted-ca\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.949190 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/870349d6-17d4-4148-ab88-6db887f850b3-registry-certificates\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.949237 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch2kp\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-kube-api-access-ch2kp\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.949257 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/870349d6-17d4-4148-ab88-6db887f850b3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.949296 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-bound-sa-token\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.950761 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/870349d6-17d4-4148-ab88-6db887f850b3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.951168 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/870349d6-17d4-4148-ab88-6db887f850b3-registry-certificates\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.951721 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870349d6-17d4-4148-ab88-6db887f850b3-trusted-ca\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.956903 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-registry-tls\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.957895 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/870349d6-17d4-4148-ab88-6db887f850b3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.963973 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch2kp\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-kube-api-access-ch2kp\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:55 crc kubenswrapper[4626]: I0223 06:47:55.965405 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870349d6-17d4-4148-ab88-6db887f850b3-bound-sa-token\") pod \"image-registry-66df7c8f76-dvrq9\" (UID: \"870349d6-17d4-4148-ab88-6db887f850b3\") " pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:56 crc kubenswrapper[4626]: I0223 06:47:56.028974 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:56 crc kubenswrapper[4626]: I0223 06:47:56.426303 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dvrq9"]
Feb 23 06:47:56 crc kubenswrapper[4626]: I0223 06:47:56.539085 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9" event={"ID":"870349d6-17d4-4148-ab88-6db887f850b3","Type":"ContainerStarted","Data":"45e91668e9cfd3ac463cb9531daee368adca0c65f3a51856d893040d7f646c66"}
Feb 23 06:47:57 crc kubenswrapper[4626]: I0223 06:47:57.548096 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9" event={"ID":"870349d6-17d4-4148-ab88-6db887f850b3","Type":"ContainerStarted","Data":"0f77037842371b2507e93736069e4f53cbb508bc0d3b16ee7a6942178866530b"}
Feb 23 06:47:57 crc kubenswrapper[4626]: I0223 06:47:57.548614 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:47:57 crc kubenswrapper[4626]: I0223 06:47:57.566481 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9" podStartSLOduration=2.566461049 podStartE2EDuration="2.566461049s" podCreationTimestamp="2026-02-23 06:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:57.56403071 +0000 UTC m=+429.903359976" watchObservedRunningTime="2026-02-23 06:47:57.566461049 +0000 UTC m=+429.905790315"
Feb 23 06:48:16 crc kubenswrapper[4626]: I0223 06:48:16.037470 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dvrq9"
Feb 23 06:48:16 crc kubenswrapper[4626]: I0223 06:48:16.093218 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xgpg7"]
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.685334 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.685852 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.685961 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.686573 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b478f95f1f86a5654dee4643a7ca0e01deee382932ab3857cea5b660341ebe5"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.686644 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://5b478f95f1f86a5654dee4643a7ca0e01deee382932ab3857cea5b660341ebe5" gracePeriod=600
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.897591 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swmdj"]
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.897970 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-swmdj" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="registry-server" containerID="cri-o://c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0" gracePeriod=30
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.909083 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb2fx"]
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.909319 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wb2fx" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="registry-server" containerID="cri-o://b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9" gracePeriod=30
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.922103 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vs6nc"]
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.922320 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerName="marketplace-operator" containerID="cri-o://e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929" gracePeriod=30
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.929461 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvhss"]
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.929658 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wvhss" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerName="registry-server" containerID="cri-o://42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372" gracePeriod=30
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.941603 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rbtx"]
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.941850 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6rbtx" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerName="registry-server" containerID="cri-o://03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b" gracePeriod=30
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.960312 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-222dt"]
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.961236 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.968354 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gkd\" (UniqueName: \"kubernetes.io/projected/34eadfde-147c-470f-bf62-db7e15fbf337-kube-api-access-j5gkd\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.968442 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34eadfde-147c-470f-bf62-db7e15fbf337-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.968462 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34eadfde-147c-470f-bf62-db7e15fbf337-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:25 crc kubenswrapper[4626]: I0223 06:48:25.974924 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-222dt"]
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.069807 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34eadfde-147c-470f-bf62-db7e15fbf337-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.069851 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34eadfde-147c-470f-bf62-db7e15fbf337-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.069890 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5gkd\" (UniqueName: \"kubernetes.io/projected/34eadfde-147c-470f-bf62-db7e15fbf337-kube-api-access-j5gkd\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.072890 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34eadfde-147c-470f-bf62-db7e15fbf337-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.083158 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34eadfde-147c-470f-bf62-db7e15fbf337-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.087044 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5gkd\" (UniqueName: \"kubernetes.io/projected/34eadfde-147c-470f-bf62-db7e15fbf337-kube-api-access-j5gkd\") pod \"marketplace-operator-79b997595-222dt\" (UID: \"34eadfde-147c-470f-bf62-db7e15fbf337\") " pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.181810 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-222dt"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.223711 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wb2fx"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.272135 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-catalog-content\") pod \"b3028805-3229-4cc3-9e20-2bca252b2c19\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.272195 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-utilities\") pod \"b3028805-3229-4cc3-9e20-2bca252b2c19\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.272223 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7hcb\" (UniqueName: \"kubernetes.io/projected/b3028805-3229-4cc3-9e20-2bca252b2c19-kube-api-access-v7hcb\") pod \"b3028805-3229-4cc3-9e20-2bca252b2c19\" (UID: \"b3028805-3229-4cc3-9e20-2bca252b2c19\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.274214 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-utilities" (OuterVolumeSpecName: "utilities") pod "b3028805-3229-4cc3-9e20-2bca252b2c19" (UID: "b3028805-3229-4cc3-9e20-2bca252b2c19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.278877 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3028805-3229-4cc3-9e20-2bca252b2c19-kube-api-access-v7hcb" (OuterVolumeSpecName: "kube-api-access-v7hcb") pod "b3028805-3229-4cc3-9e20-2bca252b2c19" (UID: "b3028805-3229-4cc3-9e20-2bca252b2c19"). InnerVolumeSpecName "kube-api-access-v7hcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.335218 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3028805-3229-4cc3-9e20-2bca252b2c19" (UID: "b3028805-3229-4cc3-9e20-2bca252b2c19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.374695 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.374733 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3028805-3229-4cc3-9e20-2bca252b2c19-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.374745 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7hcb\" (UniqueName: \"kubernetes.io/projected/b3028805-3229-4cc3-9e20-2bca252b2c19-kube-api-access-v7hcb\") on node \"crc\" DevicePath \"\""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.637243 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swmdj"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.640464 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rbtx"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.647120 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvhss"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.649661 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc"
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679248 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-catalog-content\") pod \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679285 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qlg6\" (UniqueName: \"kubernetes.io/projected/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-kube-api-access-7qlg6\") pod \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679332 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-utilities\") pod \"a1530c30-549e-4d85-b7ee-086832420311\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679364 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-trusted-ca\") pod \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679432 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-catalog-content\") pod \"ad17db5e-dbea-4e58-94b2-6a897f475993\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679468 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-utilities\") pod \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\" (UID: \"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679487 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkjld\" (UniqueName: \"kubernetes.io/projected/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-kube-api-access-lkjld\") pod \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679580 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65kcn\" (UniqueName: \"kubernetes.io/projected/a1530c30-549e-4d85-b7ee-086832420311-kube-api-access-65kcn\") pod \"a1530c30-549e-4d85-b7ee-086832420311\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679612 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-utilities\") pod \"ad17db5e-dbea-4e58-94b2-6a897f475993\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679633 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-operator-metrics\") pod \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\" (UID: \"c98d4d0e-9a53-4b64-a26d-11eda45f90fa\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679654 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzzgw\" (UniqueName: \"kubernetes.io/projected/ad17db5e-dbea-4e58-94b2-6a897f475993-kube-api-access-fzzgw\") pod \"ad17db5e-dbea-4e58-94b2-6a897f475993\" (UID: \"ad17db5e-dbea-4e58-94b2-6a897f475993\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.679673 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-catalog-content\") pod \"a1530c30-549e-4d85-b7ee-086832420311\" (UID: \"a1530c30-549e-4d85-b7ee-086832420311\") "
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.680402 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-utilities" (OuterVolumeSpecName: "utilities") pod "ad17db5e-dbea-4e58-94b2-6a897f475993" (UID: "ad17db5e-dbea-4e58-94b2-6a897f475993"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.682017 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-utilities" (OuterVolumeSpecName: "utilities") pod "a1530c30-549e-4d85-b7ee-086832420311" (UID: "a1530c30-549e-4d85-b7ee-086832420311"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.685832 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c98d4d0e-9a53-4b64-a26d-11eda45f90fa" (UID: "c98d4d0e-9a53-4b64-a26d-11eda45f90fa"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.688619 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-utilities" (OuterVolumeSpecName: "utilities") pod "dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" (UID: "dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.691990 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c98d4d0e-9a53-4b64-a26d-11eda45f90fa" (UID: "c98d4d0e-9a53-4b64-a26d-11eda45f90fa"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.695242 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-kube-api-access-lkjld" (OuterVolumeSpecName: "kube-api-access-lkjld") pod "c98d4d0e-9a53-4b64-a26d-11eda45f90fa" (UID: "c98d4d0e-9a53-4b64-a26d-11eda45f90fa"). InnerVolumeSpecName "kube-api-access-lkjld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.696743 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1530c30-549e-4d85-b7ee-086832420311-kube-api-access-65kcn" (OuterVolumeSpecName: "kube-api-access-65kcn") pod "a1530c30-549e-4d85-b7ee-086832420311" (UID: "a1530c30-549e-4d85-b7ee-086832420311"). InnerVolumeSpecName "kube-api-access-65kcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.702647 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-kube-api-access-7qlg6" (OuterVolumeSpecName: "kube-api-access-7qlg6") pod "dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" (UID: "dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5"). InnerVolumeSpecName "kube-api-access-7qlg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.712461 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" (UID: "dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.714637 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad17db5e-dbea-4e58-94b2-6a897f475993-kube-api-access-fzzgw" (OuterVolumeSpecName: "kube-api-access-fzzgw") pod "ad17db5e-dbea-4e58-94b2-6a897f475993" (UID: "ad17db5e-dbea-4e58-94b2-6a897f475993"). InnerVolumeSpecName "kube-api-access-fzzgw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.730885 4626 generic.go:334] "Generic (PLEG): container finished" podID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerID="03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.731037 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rbtx" event={"ID":"ad17db5e-dbea-4e58-94b2-6a897f475993","Type":"ContainerDied","Data":"03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.731116 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rbtx" event={"ID":"ad17db5e-dbea-4e58-94b2-6a897f475993","Type":"ContainerDied","Data":"0c981a300349b3220bfa6ae37ebcf75358b8885e637199ccb5a96d06ed4b0ff5"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.731190 4626 scope.go:117] "RemoveContainer" containerID="03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.731357 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rbtx" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.736162 4626 generic.go:334] "Generic (PLEG): container finished" podID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerID="b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.736223 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb2fx" event={"ID":"b3028805-3229-4cc3-9e20-2bca252b2c19","Type":"ContainerDied","Data":"b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.736249 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wb2fx" event={"ID":"b3028805-3229-4cc3-9e20-2bca252b2c19","Type":"ContainerDied","Data":"0faa0efbc074cbb99ce742ae87ca0fd1808829c553459238ee7fa4f1fcc11886"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.736313 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wb2fx" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.741662 4626 generic.go:334] "Generic (PLEG): container finished" podID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerID="e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.741756 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" event={"ID":"c98d4d0e-9a53-4b64-a26d-11eda45f90fa","Type":"ContainerDied","Data":"e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.741824 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" event={"ID":"c98d4d0e-9a53-4b64-a26d-11eda45f90fa","Type":"ContainerDied","Data":"f4599bf751e7ffc79bd41460ce99bcf559107b050fc8a56fb5f65543c0104c46"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.741910 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vs6nc" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.749208 4626 generic.go:334] "Generic (PLEG): container finished" podID="a1530c30-549e-4d85-b7ee-086832420311" containerID="c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.749273 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swmdj" event={"ID":"a1530c30-549e-4d85-b7ee-086832420311","Type":"ContainerDied","Data":"c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.749304 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swmdj" event={"ID":"a1530c30-549e-4d85-b7ee-086832420311","Type":"ContainerDied","Data":"4bfeb70ce0dd8f3e9204f3cc51f4657f13e47f9cf63c077758cd4c51bdb6360e"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.749369 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swmdj" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.749856 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1530c30-549e-4d85-b7ee-086832420311" (UID: "a1530c30-549e-4d85-b7ee-086832420311"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.750409 4626 scope.go:117] "RemoveContainer" containerID="0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.752951 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="5b478f95f1f86a5654dee4643a7ca0e01deee382932ab3857cea5b660341ebe5" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.753018 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"5b478f95f1f86a5654dee4643a7ca0e01deee382932ab3857cea5b660341ebe5"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.753065 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"5bd8b825f9633bd9403c87dfeb8220ce56a9e848ab820c72cccea63f1dfee03e"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781439 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkjld\" (UniqueName: \"kubernetes.io/projected/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-kube-api-access-lkjld\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781470 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781481 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65kcn\" (UniqueName: \"kubernetes.io/projected/a1530c30-549e-4d85-b7ee-086832420311-kube-api-access-65kcn\") on node \"crc\" DevicePath \"\"" Feb 23 
06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781511 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781523 4626 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781535 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzzgw\" (UniqueName: \"kubernetes.io/projected/ad17db5e-dbea-4e58-94b2-6a897f475993-kube-api-access-fzzgw\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781544 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781557 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781566 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qlg6\" (UniqueName: \"kubernetes.io/projected/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5-kube-api-access-7qlg6\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781575 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1530c30-549e-4d85-b7ee-086832420311-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.781586 4626 reconciler_common.go:293] 
"Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c98d4d0e-9a53-4b64-a26d-11eda45f90fa-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.789587 4626 generic.go:334] "Generic (PLEG): container finished" podID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerID="42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372" exitCode=0 Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.789631 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvhss" event={"ID":"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5","Type":"ContainerDied","Data":"42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.789660 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvhss" event={"ID":"dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5","Type":"ContainerDied","Data":"b9dd0c3a15c251341ad4364af13a0f16faac424b6a02e93ed9e600d740a4425c"} Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.789750 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvhss" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.802791 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vs6nc"] Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.813116 4626 scope.go:117] "RemoveContainer" containerID="5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.822597 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vs6nc"] Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.828373 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wb2fx"] Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.831024 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-222dt"] Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.841415 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wb2fx"] Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.852595 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad17db5e-dbea-4e58-94b2-6a897f475993" (UID: "ad17db5e-dbea-4e58-94b2-6a897f475993"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.855331 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvhss"] Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.859183 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvhss"] Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.868773 4626 scope.go:117] "RemoveContainer" containerID="03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.869699 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b\": container with ID starting with 03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b not found: ID does not exist" containerID="03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.869734 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b"} err="failed to get container status \"03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b\": rpc error: code = NotFound desc = could not find container \"03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b\": container with ID starting with 03e8c2196c84adecb353c83ead41d945637b3a8975f9bf2ecf48378860e4d99b not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.869758 4626 scope.go:117] "RemoveContainer" containerID="0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.870821 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f\": container with ID starting with 0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f not found: ID does not exist" containerID="0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.870858 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f"} err="failed to get container status \"0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f\": rpc error: code = NotFound desc = could not find container \"0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f\": container with ID starting with 0c131331bf3c9894ea81ca64fd1fae08401b5a00f6132fc2ebbf96b3a7fea22f not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.870884 4626 scope.go:117] "RemoveContainer" containerID="5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.871383 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4\": container with ID starting with 5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4 not found: ID does not exist" containerID="5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.871406 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4"} err="failed to get container status \"5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4\": rpc error: code = NotFound desc = could not find container \"5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4\": container 
with ID starting with 5f0f75dcd2ad09ed6efe4ea5eeb70dfe9eed98743fb59a76b8c7dfb6a4b1e5a4 not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.871420 4626 scope.go:117] "RemoveContainer" containerID="b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.882285 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad17db5e-dbea-4e58-94b2-6a897f475993-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.883569 4626 scope.go:117] "RemoveContainer" containerID="57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.905186 4626 scope.go:117] "RemoveContainer" containerID="dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.930147 4626 scope.go:117] "RemoveContainer" containerID="b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.930559 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9\": container with ID starting with b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9 not found: ID does not exist" containerID="b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.930590 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9"} err="failed to get container status \"b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9\": rpc error: code = NotFound desc = could not find container 
\"b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9\": container with ID starting with b5517ee90142411414af59bf57a806f559f01cdfbb53666c9ff9e913f6c373f9 not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.930609 4626 scope.go:117] "RemoveContainer" containerID="57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.930912 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8\": container with ID starting with 57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8 not found: ID does not exist" containerID="57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.930975 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8"} err="failed to get container status \"57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8\": rpc error: code = NotFound desc = could not find container \"57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8\": container with ID starting with 57e751a6dcc4ea8bab382666f98aa8aaba4dd1c26e2767991afb395cdbcb10b8 not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.931004 4626 scope.go:117] "RemoveContainer" containerID="dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.931319 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08\": container with ID starting with dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08 not found: ID does not exist" 
containerID="dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.931344 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08"} err="failed to get container status \"dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08\": rpc error: code = NotFound desc = could not find container \"dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08\": container with ID starting with dd405a6f02c9fdc77d7eedc8d1d3f317fea8a68204491f508669c08d4bc64c08 not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.931359 4626 scope.go:117] "RemoveContainer" containerID="e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.945733 4626 scope.go:117] "RemoveContainer" containerID="a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.964818 4626 scope.go:117] "RemoveContainer" containerID="e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.965290 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929\": container with ID starting with e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929 not found: ID does not exist" containerID="e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.965347 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929"} err="failed to get container status 
\"e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929\": rpc error: code = NotFound desc = could not find container \"e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929\": container with ID starting with e4afd70ce62efeb3b6ee28464cb8c41bf38cfd64cc8ab6fa47a21e12ac882929 not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.965383 4626 scope.go:117] "RemoveContainer" containerID="a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189" Feb 23 06:48:26 crc kubenswrapper[4626]: E0223 06:48:26.967099 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189\": container with ID starting with a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189 not found: ID does not exist" containerID="a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.967139 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189"} err="failed to get container status \"a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189\": rpc error: code = NotFound desc = could not find container \"a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189\": container with ID starting with a9164fbd393591ea1bde1e9e72bce45c234b2b96fe625e4288bc19175b464189 not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.967183 4626 scope.go:117] "RemoveContainer" containerID="c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0" Feb 23 06:48:26 crc kubenswrapper[4626]: I0223 06:48:26.985040 4626 scope.go:117] "RemoveContainer" containerID="40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.002564 4626 
scope.go:117] "RemoveContainer" containerID="82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.015305 4626 scope.go:117] "RemoveContainer" containerID="c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.015891 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0\": container with ID starting with c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0 not found: ID does not exist" containerID="c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.015924 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0"} err="failed to get container status \"c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0\": rpc error: code = NotFound desc = could not find container \"c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0\": container with ID starting with c0be4d396cddabdbeeda047864ce6ba8c055d083d8c7c074098109f5c8b1deb0 not found: ID does not exist" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.015948 4626 scope.go:117] "RemoveContainer" containerID="40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.016544 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86\": container with ID starting with 40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86 not found: ID does not exist" containerID="40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86" Feb 23 
06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.016595 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86"} err="failed to get container status \"40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86\": rpc error: code = NotFound desc = could not find container \"40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86\": container with ID starting with 40e97828b854e9d0648d4e02d0c9d9baf3cca859aa08463c241b0b350d76cc86 not found: ID does not exist" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.016619 4626 scope.go:117] "RemoveContainer" containerID="82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.017483 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4\": container with ID starting with 82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4 not found: ID does not exist" containerID="82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.017536 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4"} err="failed to get container status \"82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4\": rpc error: code = NotFound desc = could not find container \"82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4\": container with ID starting with 82604a3929fab8184146427289ee60bf9a4b819596d814c0167a00f03dd314c4 not found: ID does not exist" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.017556 4626 scope.go:117] "RemoveContainer" 
containerID="4278639261ec4c69e03244228687a082e6de589cab1662d3c1aa7bd4a526f0f8" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.034566 4626 scope.go:117] "RemoveContainer" containerID="42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.057615 4626 scope.go:117] "RemoveContainer" containerID="80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.065035 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rbtx"] Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.070567 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6rbtx"] Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.079584 4626 scope.go:117] "RemoveContainer" containerID="54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.093482 4626 scope.go:117] "RemoveContainer" containerID="42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.094052 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372\": container with ID starting with 42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372 not found: ID does not exist" containerID="42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.094098 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372"} err="failed to get container status \"42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372\": rpc error: code = NotFound desc = could not find container 
\"42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372\": container with ID starting with 42cf56976687f299411f2db3bc10832e8a9e1c6ad88637e20fac2cfd36f53372 not found: ID does not exist" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.094125 4626 scope.go:117] "RemoveContainer" containerID="80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.094408 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893\": container with ID starting with 80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893 not found: ID does not exist" containerID="80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.094456 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893"} err="failed to get container status \"80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893\": rpc error: code = NotFound desc = could not find container \"80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893\": container with ID starting with 80fb85d7982984dbd2c4088bb73161fcbc88c4b1e94dd5216a1209d9dcb0b893 not found: ID does not exist" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.094479 4626 scope.go:117] "RemoveContainer" containerID="54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.094776 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092\": container with ID starting with 54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092 not found: ID does not exist" 
containerID="54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.094809 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092"} err="failed to get container status \"54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092\": rpc error: code = NotFound desc = could not find container \"54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092\": container with ID starting with 54410689275fe220cff0c506479225f5e8ede5f8ca60883f950de1a0b8bbb092 not found: ID does not exist" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.095038 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swmdj"] Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.098762 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-swmdj"] Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717273 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fpznx"] Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717736 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerName="marketplace-operator" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717754 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerName="marketplace-operator" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717762 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerName="marketplace-operator" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717768 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" 
containerName="marketplace-operator" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717775 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717780 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717786 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717791 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717806 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717812 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717821 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717826 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717835 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717840 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" 
containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717847 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717853 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717860 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717865 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717873 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717878 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717885 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717889 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="extract-content" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717898 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717903 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" 
containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717913 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717918 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="extract-utilities" Feb 23 06:48:27 crc kubenswrapper[4626]: E0223 06:48:27.717929 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.717935 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.718022 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.718031 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.718039 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.718047 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerName="marketplace-operator" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.718055 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1530c30-549e-4d85-b7ee-086832420311" containerName="registry-server" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.718200 4626 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" containerName="marketplace-operator" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.719310 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.722065 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.732973 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fpznx"] Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.793894 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mvn\" (UniqueName: \"kubernetes.io/projected/0e583bb8-92ca-41bc-bdf4-82032820e1ff-kube-api-access-m7mvn\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.794075 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-catalog-content\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.794170 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-utilities\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.818102 4626 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/marketplace-operator-79b997595-222dt" event={"ID":"34eadfde-147c-470f-bf62-db7e15fbf337","Type":"ContainerStarted","Data":"5a5f7f1573bb01db748e205cc6d31c1e9c4ae378a2791b8cea73aba4914f34de"} Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.818146 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-222dt" event={"ID":"34eadfde-147c-470f-bf62-db7e15fbf337","Type":"ContainerStarted","Data":"b26d1a9142e9571ae5b9bfde8ffe92c7e96269b7b701b9aa12ea2a3a130b94d5"} Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.818661 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-222dt" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.821320 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-222dt" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.839005 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-222dt" podStartSLOduration=2.838981328 podStartE2EDuration="2.838981328s" podCreationTimestamp="2026-02-23 06:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:27.831476644 +0000 UTC m=+460.170805901" watchObservedRunningTime="2026-02-23 06:48:27.838981328 +0000 UTC m=+460.178310595" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.896044 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-catalog-content\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 
06:48:27.896088 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-utilities\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.896204 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mvn\" (UniqueName: \"kubernetes.io/projected/0e583bb8-92ca-41bc-bdf4-82032820e1ff-kube-api-access-m7mvn\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.896703 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-catalog-content\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.896718 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-utilities\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.916106 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mvn\" (UniqueName: \"kubernetes.io/projected/0e583bb8-92ca-41bc-bdf4-82032820e1ff-kube-api-access-m7mvn\") pod \"certified-operators-fpznx\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.990021 4626 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1530c30-549e-4d85-b7ee-086832420311" path="/var/lib/kubelet/pods/a1530c30-549e-4d85-b7ee-086832420311/volumes" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.990716 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad17db5e-dbea-4e58-94b2-6a897f475993" path="/var/lib/kubelet/pods/ad17db5e-dbea-4e58-94b2-6a897f475993/volumes" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.991415 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3028805-3229-4cc3-9e20-2bca252b2c19" path="/var/lib/kubelet/pods/b3028805-3229-4cc3-9e20-2bca252b2c19/volumes" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.992731 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98d4d0e-9a53-4b64-a26d-11eda45f90fa" path="/var/lib/kubelet/pods/c98d4d0e-9a53-4b64-a26d-11eda45f90fa/volumes" Feb 23 06:48:27 crc kubenswrapper[4626]: I0223 06:48:27.993265 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5" path="/var/lib/kubelet/pods/dae4cf51-a44a-4e26-9e7e-ac0ca8c797c5/volumes" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.037339 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.417519 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fpznx"] Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.726433 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dvgms"] Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.729459 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.734752 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.748842 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvgms"] Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.812328 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe4020a-d6b4-48ac-93bd-9afc54de6668-utilities\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.812599 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72jt4\" (UniqueName: \"kubernetes.io/projected/cbe4020a-d6b4-48ac-93bd-9afc54de6668-kube-api-access-72jt4\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.812635 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe4020a-d6b4-48ac-93bd-9afc54de6668-catalog-content\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.832663 4626 generic.go:334] "Generic (PLEG): container finished" podID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerID="7f59624a371cfbc5c1c954567e1247e354d2ed15c94f6b17b9f97f936e9c3fa2" exitCode=0 Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 
06:48:28.834087 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpznx" event={"ID":"0e583bb8-92ca-41bc-bdf4-82032820e1ff","Type":"ContainerDied","Data":"7f59624a371cfbc5c1c954567e1247e354d2ed15c94f6b17b9f97f936e9c3fa2"} Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.834130 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpznx" event={"ID":"0e583bb8-92ca-41bc-bdf4-82032820e1ff","Type":"ContainerStarted","Data":"16eaf2654483cdb31fbaaa978274f5dff20a9cfe32045bfd66ecfe312879de15"} Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.913970 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe4020a-d6b4-48ac-93bd-9afc54de6668-utilities\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.914014 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jt4\" (UniqueName: \"kubernetes.io/projected/cbe4020a-d6b4-48ac-93bd-9afc54de6668-kube-api-access-72jt4\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.914064 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbe4020a-d6b4-48ac-93bd-9afc54de6668-catalog-content\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.914659 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cbe4020a-d6b4-48ac-93bd-9afc54de6668-catalog-content\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.916196 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbe4020a-d6b4-48ac-93bd-9afc54de6668-utilities\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:28 crc kubenswrapper[4626]: I0223 06:48:28.934530 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jt4\" (UniqueName: \"kubernetes.io/projected/cbe4020a-d6b4-48ac-93bd-9afc54de6668-kube-api-access-72jt4\") pod \"redhat-marketplace-dvgms\" (UID: \"cbe4020a-d6b4-48ac-93bd-9afc54de6668\") " pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:29 crc kubenswrapper[4626]: I0223 06:48:29.055492 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:29 crc kubenswrapper[4626]: I0223 06:48:29.454541 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvgms"] Feb 23 06:48:29 crc kubenswrapper[4626]: I0223 06:48:29.841289 4626 generic.go:334] "Generic (PLEG): container finished" podID="cbe4020a-d6b4-48ac-93bd-9afc54de6668" containerID="6df0b08d719d78f1cfd4462b0df45004c50aab09074a81c20e2816db28c39357" exitCode=0 Feb 23 06:48:29 crc kubenswrapper[4626]: I0223 06:48:29.841391 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvgms" event={"ID":"cbe4020a-d6b4-48ac-93bd-9afc54de6668","Type":"ContainerDied","Data":"6df0b08d719d78f1cfd4462b0df45004c50aab09074a81c20e2816db28c39357"} Feb 23 06:48:29 crc kubenswrapper[4626]: I0223 06:48:29.841593 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvgms" event={"ID":"cbe4020a-d6b4-48ac-93bd-9afc54de6668","Type":"ContainerStarted","Data":"b4f27a9a0bf2ec163c0a363709187a03882b3e95e9cbb9a07b392059eabd73fb"} Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.113691 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sll72"] Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.114665 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.121207 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.126291 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sll72"] Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.133190 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvlbp\" (UniqueName: \"kubernetes.io/projected/e6570124-6817-4052-89d0-179b1556ea3e-kube-api-access-vvlbp\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.133232 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6570124-6817-4052-89d0-179b1556ea3e-utilities\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.133251 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6570124-6817-4052-89d0-179b1556ea3e-catalog-content\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.234023 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvlbp\" (UniqueName: \"kubernetes.io/projected/e6570124-6817-4052-89d0-179b1556ea3e-kube-api-access-vvlbp\") pod \"redhat-operators-sll72\" (UID: 
\"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.234062 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6570124-6817-4052-89d0-179b1556ea3e-utilities\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.234082 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6570124-6817-4052-89d0-179b1556ea3e-catalog-content\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.234572 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6570124-6817-4052-89d0-179b1556ea3e-catalog-content\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.234769 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6570124-6817-4052-89d0-179b1556ea3e-utilities\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.256679 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvlbp\" (UniqueName: \"kubernetes.io/projected/e6570124-6817-4052-89d0-179b1556ea3e-kube-api-access-vvlbp\") pod \"redhat-operators-sll72\" (UID: \"e6570124-6817-4052-89d0-179b1556ea3e\") " 
pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.485118 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.696730 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sll72"] Feb 23 06:48:30 crc kubenswrapper[4626]: W0223 06:48:30.706042 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6570124_6817_4052_89d0_179b1556ea3e.slice/crio-9e701e08853c2bea75d2611635f8d88da44373f2e371ebd8141054e13b0a2553 WatchSource:0}: Error finding container 9e701e08853c2bea75d2611635f8d88da44373f2e371ebd8141054e13b0a2553: Status 404 returned error can't find the container with id 9e701e08853c2bea75d2611635f8d88da44373f2e371ebd8141054e13b0a2553 Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.859531 4626 generic.go:334] "Generic (PLEG): container finished" podID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerID="ffcb7474aaa3ad06ba88b798c87579815df7dccc3d57f4d945a94985a910c9c5" exitCode=0 Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.859601 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpznx" event={"ID":"0e583bb8-92ca-41bc-bdf4-82032820e1ff","Type":"ContainerDied","Data":"ffcb7474aaa3ad06ba88b798c87579815df7dccc3d57f4d945a94985a910c9c5"} Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.881122 4626 generic.go:334] "Generic (PLEG): container finished" podID="cbe4020a-d6b4-48ac-93bd-9afc54de6668" containerID="de1ba3e2a4caac0860c8c545ba4f20778869c51de694c1186374a74860a025c0" exitCode=0 Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.881229 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvgms" 
event={"ID":"cbe4020a-d6b4-48ac-93bd-9afc54de6668","Type":"ContainerDied","Data":"de1ba3e2a4caac0860c8c545ba4f20778869c51de694c1186374a74860a025c0"} Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.884089 4626 generic.go:334] "Generic (PLEG): container finished" podID="e6570124-6817-4052-89d0-179b1556ea3e" containerID="0e3e9f4e544c33d7b809b95e2de566d572f1a1006b15612006d07f5c0445db8e" exitCode=0 Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.884130 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sll72" event={"ID":"e6570124-6817-4052-89d0-179b1556ea3e","Type":"ContainerDied","Data":"0e3e9f4e544c33d7b809b95e2de566d572f1a1006b15612006d07f5c0445db8e"} Feb 23 06:48:30 crc kubenswrapper[4626]: I0223 06:48:30.884154 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sll72" event={"ID":"e6570124-6817-4052-89d0-179b1556ea3e","Type":"ContainerStarted","Data":"9e701e08853c2bea75d2611635f8d88da44373f2e371ebd8141054e13b0a2553"} Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.123340 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fw4n6"] Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.125364 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.128202 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.150662 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fw4n6"] Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.252281 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp6bq\" (UniqueName: \"kubernetes.io/projected/330ef0cc-5cfd-445b-ab4a-76df383091f0-kube-api-access-lp6bq\") pod \"community-operators-fw4n6\" (UID: \"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.252367 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330ef0cc-5cfd-445b-ab4a-76df383091f0-catalog-content\") pod \"community-operators-fw4n6\" (UID: \"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.252455 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330ef0cc-5cfd-445b-ab4a-76df383091f0-utilities\") pod \"community-operators-fw4n6\" (UID: \"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.353381 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330ef0cc-5cfd-445b-ab4a-76df383091f0-catalog-content\") pod \"community-operators-fw4n6\" (UID: 
\"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.353421 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330ef0cc-5cfd-445b-ab4a-76df383091f0-utilities\") pod \"community-operators-fw4n6\" (UID: \"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.353472 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp6bq\" (UniqueName: \"kubernetes.io/projected/330ef0cc-5cfd-445b-ab4a-76df383091f0-kube-api-access-lp6bq\") pod \"community-operators-fw4n6\" (UID: \"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.354004 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330ef0cc-5cfd-445b-ab4a-76df383091f0-catalog-content\") pod \"community-operators-fw4n6\" (UID: \"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.354213 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330ef0cc-5cfd-445b-ab4a-76df383091f0-utilities\") pod \"community-operators-fw4n6\" (UID: \"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.385094 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp6bq\" (UniqueName: \"kubernetes.io/projected/330ef0cc-5cfd-445b-ab4a-76df383091f0-kube-api-access-lp6bq\") pod \"community-operators-fw4n6\" (UID: 
\"330ef0cc-5cfd-445b-ab4a-76df383091f0\") " pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.441403 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.626111 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fw4n6"] Feb 23 06:48:31 crc kubenswrapper[4626]: W0223 06:48:31.634438 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330ef0cc_5cfd_445b_ab4a_76df383091f0.slice/crio-2f9907a61dd2a36c48330da55e8c180d6e1ce0673e28c649be8c5a8ed9baf30e WatchSource:0}: Error finding container 2f9907a61dd2a36c48330da55e8c180d6e1ce0673e28c649be8c5a8ed9baf30e: Status 404 returned error can't find the container with id 2f9907a61dd2a36c48330da55e8c180d6e1ce0673e28c649be8c5a8ed9baf30e Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.892582 4626 generic.go:334] "Generic (PLEG): container finished" podID="330ef0cc-5cfd-445b-ab4a-76df383091f0" containerID="a3a601f2a1032d9a7c5d3f0d663aa520ad4507375f936925d9855aaea2aa5b49" exitCode=0 Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.892653 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw4n6" event={"ID":"330ef0cc-5cfd-445b-ab4a-76df383091f0","Type":"ContainerDied","Data":"a3a601f2a1032d9a7c5d3f0d663aa520ad4507375f936925d9855aaea2aa5b49"} Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.892686 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw4n6" event={"ID":"330ef0cc-5cfd-445b-ab4a-76df383091f0","Type":"ContainerStarted","Data":"2f9907a61dd2a36c48330da55e8c180d6e1ce0673e28c649be8c5a8ed9baf30e"} Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.895627 4626 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpznx" event={"ID":"0e583bb8-92ca-41bc-bdf4-82032820e1ff","Type":"ContainerStarted","Data":"f576fedf877826fb0e97e31350b22aba1c72ff67d25171bdf9a417159566975b"} Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.900296 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvgms" event={"ID":"cbe4020a-d6b4-48ac-93bd-9afc54de6668","Type":"ContainerStarted","Data":"b14635220cb3e7ed13842b1c5397c86f4607dc4d8d663c0470c59c92f0d2071d"} Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.902321 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sll72" event={"ID":"e6570124-6817-4052-89d0-179b1556ea3e","Type":"ContainerStarted","Data":"81d664119fe3e2e7aab028768c6336e5072f1ef2de1c6934566fb1a0af6205a9"} Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.941545 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fpznx" podStartSLOduration=2.429382896 podStartE2EDuration="4.94152675s" podCreationTimestamp="2026-02-23 06:48:27 +0000 UTC" firstStartedPulling="2026-02-23 06:48:28.835590477 +0000 UTC m=+461.174919743" lastFinishedPulling="2026-02-23 06:48:31.34773433 +0000 UTC m=+463.687063597" observedRunningTime="2026-02-23 06:48:31.940972075 +0000 UTC m=+464.280301341" watchObservedRunningTime="2026-02-23 06:48:31.94152675 +0000 UTC m=+464.280856016" Feb 23 06:48:31 crc kubenswrapper[4626]: I0223 06:48:31.955153 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dvgms" podStartSLOduration=2.407793496 podStartE2EDuration="3.955130429s" podCreationTimestamp="2026-02-23 06:48:28 +0000 UTC" firstStartedPulling="2026-02-23 06:48:29.843177715 +0000 UTC m=+462.182506981" lastFinishedPulling="2026-02-23 06:48:31.390514648 +0000 UTC m=+463.729843914" observedRunningTime="2026-02-23 
06:48:31.95284434 +0000 UTC m=+464.292173606" watchObservedRunningTime="2026-02-23 06:48:31.955130429 +0000 UTC m=+464.294459695" Feb 23 06:48:32 crc kubenswrapper[4626]: I0223 06:48:32.909544 4626 generic.go:334] "Generic (PLEG): container finished" podID="e6570124-6817-4052-89d0-179b1556ea3e" containerID="81d664119fe3e2e7aab028768c6336e5072f1ef2de1c6934566fb1a0af6205a9" exitCode=0 Feb 23 06:48:32 crc kubenswrapper[4626]: I0223 06:48:32.909667 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sll72" event={"ID":"e6570124-6817-4052-89d0-179b1556ea3e","Type":"ContainerDied","Data":"81d664119fe3e2e7aab028768c6336e5072f1ef2de1c6934566fb1a0af6205a9"} Feb 23 06:48:33 crc kubenswrapper[4626]: I0223 06:48:33.922519 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sll72" event={"ID":"e6570124-6817-4052-89d0-179b1556ea3e","Type":"ContainerStarted","Data":"367175aec1273e14551e7b1bc7affc682c5939301ea1c88fd738ac0cd69475af"} Feb 23 06:48:33 crc kubenswrapper[4626]: I0223 06:48:33.924981 4626 generic.go:334] "Generic (PLEG): container finished" podID="330ef0cc-5cfd-445b-ab4a-76df383091f0" containerID="fdc23a20660dd204bbe777ab3f6c6b3fd2d335a74c237126c85d24190e60fc3c" exitCode=0 Feb 23 06:48:33 crc kubenswrapper[4626]: I0223 06:48:33.925053 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw4n6" event={"ID":"330ef0cc-5cfd-445b-ab4a-76df383091f0","Type":"ContainerDied","Data":"fdc23a20660dd204bbe777ab3f6c6b3fd2d335a74c237126c85d24190e60fc3c"} Feb 23 06:48:33 crc kubenswrapper[4626]: I0223 06:48:33.949022 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sll72" podStartSLOduration=1.419043416 podStartE2EDuration="3.948998629s" podCreationTimestamp="2026-02-23 06:48:30 +0000 UTC" firstStartedPulling="2026-02-23 06:48:30.887452627 +0000 UTC m=+463.226781894" 
lastFinishedPulling="2026-02-23 06:48:33.41740784 +0000 UTC m=+465.756737107" observedRunningTime="2026-02-23 06:48:33.942199525 +0000 UTC m=+466.281528790" watchObservedRunningTime="2026-02-23 06:48:33.948998629 +0000 UTC m=+466.288327896" Feb 23 06:48:35 crc kubenswrapper[4626]: I0223 06:48:35.942590 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fw4n6" event={"ID":"330ef0cc-5cfd-445b-ab4a-76df383091f0","Type":"ContainerStarted","Data":"988a056ecc5747575e0092ad750501299d46022db7a623949a47bdac707cf2e8"} Feb 23 06:48:35 crc kubenswrapper[4626]: I0223 06:48:35.966941 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fw4n6" podStartSLOduration=2.469087229 podStartE2EDuration="4.966921184s" podCreationTimestamp="2026-02-23 06:48:31 +0000 UTC" firstStartedPulling="2026-02-23 06:48:31.893752981 +0000 UTC m=+464.233082246" lastFinishedPulling="2026-02-23 06:48:34.391586935 +0000 UTC m=+466.730916201" observedRunningTime="2026-02-23 06:48:35.964460538 +0000 UTC m=+468.303789804" watchObservedRunningTime="2026-02-23 06:48:35.966921184 +0000 UTC m=+468.306250451" Feb 23 06:48:38 crc kubenswrapper[4626]: I0223 06:48:38.038237 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:38 crc kubenswrapper[4626]: I0223 06:48:38.038715 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:38 crc kubenswrapper[4626]: I0223 06:48:38.085330 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:38 crc kubenswrapper[4626]: I0223 06:48:38.993164 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 06:48:39 crc kubenswrapper[4626]: 
I0223 06:48:39.056454 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:39 crc kubenswrapper[4626]: I0223 06:48:39.056524 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:39 crc kubenswrapper[4626]: I0223 06:48:39.088428 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:39 crc kubenswrapper[4626]: I0223 06:48:39.998732 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dvgms" Feb 23 06:48:40 crc kubenswrapper[4626]: I0223 06:48:40.485832 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:40 crc kubenswrapper[4626]: I0223 06:48:40.486299 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:40 crc kubenswrapper[4626]: I0223 06:48:40.530575 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.008223 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sll72" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.119366 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" podUID="7e5e4e49-7bad-4e99-a662-b0f4ca041477" containerName="registry" containerID="cri-o://395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd" gracePeriod=30 Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.422598 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.444643 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.445054 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.482688 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614345 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-tls\") pod \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614426 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-bound-sa-token\") pod \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614556 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc9xj\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-kube-api-access-cc9xj\") pod \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614585 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-trusted-ca\") pod 
\"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614618 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e5e4e49-7bad-4e99-a662-b0f4ca041477-ca-trust-extracted\") pod \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614667 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-certificates\") pod \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614876 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.614938 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e5e4e49-7bad-4e99-a662-b0f4ca041477-installation-pull-secrets\") pod \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\" (UID: \"7e5e4e49-7bad-4e99-a662-b0f4ca041477\") " Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.615539 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.615614 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.625140 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.625599 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5e4e49-7bad-4e99-a662-b0f4ca041477-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.625717 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.625732 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-kube-api-access-cc9xj" (OuterVolumeSpecName: "kube-api-access-cc9xj") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "kube-api-access-cc9xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.627287 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.632019 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e5e4e49-7bad-4e99-a662-b0f4ca041477-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7e5e4e49-7bad-4e99-a662-b0f4ca041477" (UID: "7e5e4e49-7bad-4e99-a662-b0f4ca041477"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.716492 4626 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.716545 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc9xj\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-kube-api-access-cc9xj\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.716563 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.716573 4626 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e5e4e49-7bad-4e99-a662-b0f4ca041477-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.716586 4626 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.716597 4626 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e5e4e49-7bad-4e99-a662-b0f4ca041477-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.716606 4626 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e5e4e49-7bad-4e99-a662-b0f4ca041477-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:41 crc 
kubenswrapper[4626]: I0223 06:48:41.980038 4626 generic.go:334] "Generic (PLEG): container finished" podID="7e5e4e49-7bad-4e99-a662-b0f4ca041477" containerID="395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd" exitCode=0 Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.980129 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.980135 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" event={"ID":"7e5e4e49-7bad-4e99-a662-b0f4ca041477","Type":"ContainerDied","Data":"395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd"} Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.980203 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xgpg7" event={"ID":"7e5e4e49-7bad-4e99-a662-b0f4ca041477","Type":"ContainerDied","Data":"da776f2d23e6183ae26feda02b1e21899237587657bfda4b02531ad25964ab9b"} Feb 23 06:48:41 crc kubenswrapper[4626]: I0223 06:48:41.980234 4626 scope.go:117] "RemoveContainer" containerID="395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd" Feb 23 06:48:42 crc kubenswrapper[4626]: I0223 06:48:42.002853 4626 scope.go:117] "RemoveContainer" containerID="395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd" Feb 23 06:48:42 crc kubenswrapper[4626]: E0223 06:48:42.003305 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd\": container with ID starting with 395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd not found: ID does not exist" containerID="395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd" Feb 23 06:48:42 crc kubenswrapper[4626]: I0223 06:48:42.003352 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd"} err="failed to get container status \"395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd\": rpc error: code = NotFound desc = could not find container \"395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd\": container with ID starting with 395cfd7a03d04f1b7756806b7db5002291ac944261d085591fcfeb1929ad32fd not found: ID does not exist" Feb 23 06:48:42 crc kubenswrapper[4626]: I0223 06:48:42.019729 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fw4n6" Feb 23 06:48:42 crc kubenswrapper[4626]: I0223 06:48:42.023657 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xgpg7"] Feb 23 06:48:42 crc kubenswrapper[4626]: I0223 06:48:42.032486 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xgpg7"] Feb 23 06:48:43 crc kubenswrapper[4626]: I0223 06:48:43.989559 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e5e4e49-7bad-4e99-a662-b0f4ca041477" path="/var/lib/kubelet/pods/7e5e4e49-7bad-4e99-a662-b0f4ca041477/volumes" Feb 23 06:50:25 crc kubenswrapper[4626]: I0223 06:50:25.685153 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:50:25 crc kubenswrapper[4626]: I0223 06:50:25.685822 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:50:55 crc kubenswrapper[4626]: I0223 06:50:55.685453 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:50:55 crc kubenswrapper[4626]: I0223 06:50:55.686277 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.685290 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.685917 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.685975 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.686469 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5bd8b825f9633bd9403c87dfeb8220ce56a9e848ab820c72cccea63f1dfee03e"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.686546 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://5bd8b825f9633bd9403c87dfeb8220ce56a9e848ab820c72cccea63f1dfee03e" gracePeriod=600 Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.957308 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="5bd8b825f9633bd9403c87dfeb8220ce56a9e848ab820c72cccea63f1dfee03e" exitCode=0 Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.957392 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"5bd8b825f9633bd9403c87dfeb8220ce56a9e848ab820c72cccea63f1dfee03e"} Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.957673 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"e82542f36ca6c092b0099e501573dbf83452c1447d038b617beda611bf3799cf"} Feb 23 06:51:25 crc kubenswrapper[4626]: I0223 06:51:25.957701 4626 scope.go:117] "RemoveContainer" containerID="5b478f95f1f86a5654dee4643a7ca0e01deee382932ab3857cea5b660341ebe5" Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.407292 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4"] Feb 23 06:52:51 crc kubenswrapper[4626]: E0223 
06:52:51.408391 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5e4e49-7bad-4e99-a662-b0f4ca041477" containerName="registry" Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.408413 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5e4e49-7bad-4e99-a662-b0f4ca041477" containerName="registry" Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.408553 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5e4e49-7bad-4e99-a662-b0f4ca041477" containerName="registry" Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.409068 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4" Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.413130 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-mlgd2"] Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.414080 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mlgd2"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.416663 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.416715 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.416933 4626 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2h6v4"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.423104 4626 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hxcmx"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.431434 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-66srh"]
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.432659 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-66srh"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.435799 4626 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-22r9f"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.440045 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mlgd2"]
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.465369 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-66srh"]
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.493536 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4"]
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.497243 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld64p\" (UniqueName: \"kubernetes.io/projected/ab37c078-9524-4dd2-b8f3-450a17f5255d-kube-api-access-ld64p\") pod \"cert-manager-858654f9db-mlgd2\" (UID: \"ab37c078-9524-4dd2-b8f3-450a17f5255d\") " pod="cert-manager/cert-manager-858654f9db-mlgd2"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.497619 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfs9\" (UniqueName: \"kubernetes.io/projected/4f172a38-4eaf-468a-99d8-99416128eef9-kube-api-access-crfs9\") pod \"cert-manager-webhook-687f57d79b-66srh\" (UID: \"4f172a38-4eaf-468a-99d8-99416128eef9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-66srh"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.497707 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb84w\" (UniqueName: \"kubernetes.io/projected/c7d90dff-0264-49d6-9d9e-ed5063ee6976-kube-api-access-sb84w\") pod \"cert-manager-cainjector-cf98fcc89-r5bt4\" (UID: \"c7d90dff-0264-49d6-9d9e-ed5063ee6976\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.598951 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crfs9\" (UniqueName: \"kubernetes.io/projected/4f172a38-4eaf-468a-99d8-99416128eef9-kube-api-access-crfs9\") pod \"cert-manager-webhook-687f57d79b-66srh\" (UID: \"4f172a38-4eaf-468a-99d8-99416128eef9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-66srh"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.599028 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb84w\" (UniqueName: \"kubernetes.io/projected/c7d90dff-0264-49d6-9d9e-ed5063ee6976-kube-api-access-sb84w\") pod \"cert-manager-cainjector-cf98fcc89-r5bt4\" (UID: \"c7d90dff-0264-49d6-9d9e-ed5063ee6976\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.599076 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld64p\" (UniqueName: \"kubernetes.io/projected/ab37c078-9524-4dd2-b8f3-450a17f5255d-kube-api-access-ld64p\") pod \"cert-manager-858654f9db-mlgd2\" (UID: \"ab37c078-9524-4dd2-b8f3-450a17f5255d\") " pod="cert-manager/cert-manager-858654f9db-mlgd2"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.618661 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld64p\" (UniqueName: \"kubernetes.io/projected/ab37c078-9524-4dd2-b8f3-450a17f5255d-kube-api-access-ld64p\") pod \"cert-manager-858654f9db-mlgd2\" (UID: \"ab37c078-9524-4dd2-b8f3-450a17f5255d\") " pod="cert-manager/cert-manager-858654f9db-mlgd2"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.618676 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb84w\" (UniqueName: \"kubernetes.io/projected/c7d90dff-0264-49d6-9d9e-ed5063ee6976-kube-api-access-sb84w\") pod \"cert-manager-cainjector-cf98fcc89-r5bt4\" (UID: \"c7d90dff-0264-49d6-9d9e-ed5063ee6976\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.619378 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfs9\" (UniqueName: \"kubernetes.io/projected/4f172a38-4eaf-468a-99d8-99416128eef9-kube-api-access-crfs9\") pod \"cert-manager-webhook-687f57d79b-66srh\" (UID: \"4f172a38-4eaf-468a-99d8-99416128eef9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-66srh"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.729195 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.735604 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mlgd2"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.745990 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-66srh"
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.960815 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4"]
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.973294 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 06:52:51 crc kubenswrapper[4626]: W0223 06:52:51.997203 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f172a38_4eaf_468a_99d8_99416128eef9.slice/crio-8e76a5a0ff29c56022a7dcce191a13d63a8690d8c589048b7f4cbb34de83a37c WatchSource:0}: Error finding container 8e76a5a0ff29c56022a7dcce191a13d63a8690d8c589048b7f4cbb34de83a37c: Status 404 returned error can't find the container with id 8e76a5a0ff29c56022a7dcce191a13d63a8690d8c589048b7f4cbb34de83a37c
Feb 23 06:52:51 crc kubenswrapper[4626]: I0223 06:52:51.998289 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-66srh"]
Feb 23 06:52:52 crc kubenswrapper[4626]: I0223 06:52:52.139227 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mlgd2"]
Feb 23 06:52:52 crc kubenswrapper[4626]: W0223 06:52:52.143028 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab37c078_9524_4dd2_b8f3_450a17f5255d.slice/crio-285d54a8fbd4a6d6520ae78a7fdd93f8bd04e740591ef26280c023ddb33182b3 WatchSource:0}: Error finding container 285d54a8fbd4a6d6520ae78a7fdd93f8bd04e740591ef26280c023ddb33182b3: Status 404 returned error can't find the container with id 285d54a8fbd4a6d6520ae78a7fdd93f8bd04e740591ef26280c023ddb33182b3
Feb 23 06:52:52 crc kubenswrapper[4626]: I0223 06:52:52.472040 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mlgd2" event={"ID":"ab37c078-9524-4dd2-b8f3-450a17f5255d","Type":"ContainerStarted","Data":"285d54a8fbd4a6d6520ae78a7fdd93f8bd04e740591ef26280c023ddb33182b3"}
Feb 23 06:52:52 crc kubenswrapper[4626]: I0223 06:52:52.473322 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-66srh" event={"ID":"4f172a38-4eaf-468a-99d8-99416128eef9","Type":"ContainerStarted","Data":"8e76a5a0ff29c56022a7dcce191a13d63a8690d8c589048b7f4cbb34de83a37c"}
Feb 23 06:52:52 crc kubenswrapper[4626]: I0223 06:52:52.474371 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4" event={"ID":"c7d90dff-0264-49d6-9d9e-ed5063ee6976","Type":"ContainerStarted","Data":"b7c8d8863b3d01ebe87f5fe91ef62e80e059081d9f438751ef96c297195d7c60"}
Feb 23 06:52:55 crc kubenswrapper[4626]: I0223 06:52:55.493825 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4" event={"ID":"c7d90dff-0264-49d6-9d9e-ed5063ee6976","Type":"ContainerStarted","Data":"1c910acf401aae95f15899c52cdbf14e32ba973d172f3eee57561947a1b9e31c"}
Feb 23 06:52:55 crc kubenswrapper[4626]: I0223 06:52:55.495542 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mlgd2" event={"ID":"ab37c078-9524-4dd2-b8f3-450a17f5255d","Type":"ContainerStarted","Data":"987270532d5577238d45bb42395a6cbbd1b7ba5141e492d82bdca30b393b763e"}
Feb 23 06:52:55 crc kubenswrapper[4626]: I0223 06:52:55.497117 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-66srh" event={"ID":"4f172a38-4eaf-468a-99d8-99416128eef9","Type":"ContainerStarted","Data":"57964856fca120b602c8637ebfdffb2bd7968ef23100c1b82affa780b47bebd1"}
Feb 23 06:52:55 crc kubenswrapper[4626]: I0223 06:52:55.497224 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-66srh"
Feb 23 06:52:55 crc kubenswrapper[4626]: I0223 06:52:55.542258 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-r5bt4" podStartSLOduration=1.339984748 podStartE2EDuration="4.542242609s" podCreationTimestamp="2026-02-23 06:52:51 +0000 UTC" firstStartedPulling="2026-02-23 06:52:51.973063807 +0000 UTC m=+724.312393073" lastFinishedPulling="2026-02-23 06:52:55.175321668 +0000 UTC m=+727.514650934" observedRunningTime="2026-02-23 06:52:55.537569388 +0000 UTC m=+727.876898654" watchObservedRunningTime="2026-02-23 06:52:55.542242609 +0000 UTC m=+727.881571876"
Feb 23 06:52:55 crc kubenswrapper[4626]: I0223 06:52:55.616207 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-66srh" podStartSLOduration=1.451074711 podStartE2EDuration="4.616186725s" podCreationTimestamp="2026-02-23 06:52:51 +0000 UTC" firstStartedPulling="2026-02-23 06:52:52.000935117 +0000 UTC m=+724.340264383" lastFinishedPulling="2026-02-23 06:52:55.166047131 +0000 UTC m=+727.505376397" observedRunningTime="2026-02-23 06:52:55.58471605 +0000 UTC m=+727.924045316" watchObservedRunningTime="2026-02-23 06:52:55.616186725 +0000 UTC m=+727.955515992"
Feb 23 06:52:55 crc kubenswrapper[4626]: I0223 06:52:55.619136 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-mlgd2" podStartSLOduration=1.5758960850000001 podStartE2EDuration="4.619125217s" podCreationTimestamp="2026-02-23 06:52:51 +0000 UTC" firstStartedPulling="2026-02-23 06:52:52.145851827 +0000 UTC m=+724.485181093" lastFinishedPulling="2026-02-23 06:52:55.189080959 +0000 UTC m=+727.528410225" observedRunningTime="2026-02-23 06:52:55.616116644 +0000 UTC m=+727.955445910" watchObservedRunningTime="2026-02-23 06:52:55.619125217 +0000 UTC m=+727.958454472"
Feb 23 06:53:01 crc kubenswrapper[4626]: I0223 06:53:01.750331 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-66srh"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.323229 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lhplf"]
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.323970 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-controller" containerID="cri-o://68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.324056 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.324040 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="nbdb" containerID="cri-o://d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.324109 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-acl-logging" containerID="cri-o://695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.324108 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="sbdb" containerID="cri-o://56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.324054 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="northd" containerID="cri-o://9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.324349 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-node" containerID="cri-o://ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.363404 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller" containerID="cri-o://592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c" gracePeriod=30
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.540250 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/2.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.541199 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/1.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.541267 4626 generic.go:334] "Generic (PLEG): container finished" podID="27fe907f-67db-4a19-a485-22debfb92983" containerID="9a25087115100c9626d3a1eafde3dd594af1266341b73b36a08abdb447c9395e" exitCode=2
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.541327 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerDied","Data":"9a25087115100c9626d3a1eafde3dd594af1266341b73b36a08abdb447c9395e"}
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.541404 4626 scope.go:117] "RemoveContainer" containerID="fe671da65574052170670c85860a65375003b3cf5f0e9eec1b95967830d50649"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.542651 4626 scope.go:117] "RemoveContainer" containerID="9a25087115100c9626d3a1eafde3dd594af1266341b73b36a08abdb447c9395e"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.543165 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lbzx5_openshift-multus(27fe907f-67db-4a19-a485-22debfb92983)\"" pod="openshift-multus/multus-lbzx5" podUID="27fe907f-67db-4a19-a485-22debfb92983"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.547365 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/3.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.550166 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovn-acl-logging/0.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.550709 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovn-controller/0.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551545 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c" exitCode=0
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551585 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83" exitCode=0
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551597 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655" exitCode=0
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551606 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a" exitCode=143
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551616 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b" exitCode=143
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551551 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c"}
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551665 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83"}
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551684 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655"}
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551699 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a"}
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.551710 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b"}
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.575357 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovnkube-controller/3.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.576209 4626 scope.go:117] "RemoveContainer" containerID="2d5bf94c6e1739adf4073091389ed0858e936bb2cc113f67a702fa71308f36b2"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.578775 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovn-acl-logging/0.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.579622 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovn-controller/0.log"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.580217 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631278 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x6hqh"]
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631591 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631617 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631634 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631641 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631654 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631659 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631668 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="sbdb"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631674 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="sbdb"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631686 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="northd"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631693 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="northd"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631699 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631704 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631712 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631717 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631726 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631735 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631745 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="nbdb"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631751 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="nbdb"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631763 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kubecfg-setup"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631771 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kubecfg-setup"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631781 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-acl-logging"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631788 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-acl-logging"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.631795 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-node"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631801 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-node"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631929 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631945 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631952 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631961 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovn-acl-logging"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631971 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="sbdb"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631978 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-node"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631986 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="northd"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631991 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="nbdb"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.631997 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.632003 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: E0223 06:53:02.632113 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.632122 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.632233 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.632243 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerName="ovnkube-controller"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.634853 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh"
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735445 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-ovn-kubernetes\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735573 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-systemd\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735612 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-config\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735635 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-ovn\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735653 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-openvswitch\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735670 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-env-overrides\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735690 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-script-lib\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735732 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-kubelet\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735750 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-etc-openvswitch\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735775 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-netd\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735785 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735795 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-node-log\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735819 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-node-log" (OuterVolumeSpecName: "node-log") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735844 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735850 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5475\" (UniqueName: \"kubernetes.io/projected/a4eb8735-20e6-4bd1-8965-4a360e39a919-kube-api-access-b5475\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735864 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735885 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-var-lib-openvswitch\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735935 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735954 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-netns\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735972 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-bin\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.735995 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-log-socket\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736027 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovn-node-metrics-cert\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736053 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-slash\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736073 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-systemd-units\") pod \"a4eb8735-20e6-4bd1-8965-4a360e39a919\" (UID: \"a4eb8735-20e6-4bd1-8965-4a360e39a919\") "
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736223 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736233 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736263 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736287 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "host-cni-netd".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736307 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-systemd\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736348 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-etc-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736345 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736381 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-systemd-units\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736410 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-run-netns\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736414 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-log-socket" (OuterVolumeSpecName: "log-socket") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736434 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-slash\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736466 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-cni-bin\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736486 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-var-lib-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736538 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-log-socket\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736558 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736577 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-env-overrides\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736604 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wbln\" (UniqueName: \"kubernetes.io/projected/8541f24b-018e-42f0-93ad-29e70a36cbf8-kube-api-access-7wbln\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736628 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736640 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-node-log\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736659 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-kubelet\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736669 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736684 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovnkube-script-lib\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736723 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovn-node-metrics-cert\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736748 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-run-ovn-kubernetes\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736775 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-ovn\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736788 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-cni-netd\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736802 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovnkube-config\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736847 4626 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-log-socket\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736858 4626 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736869 4626 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736878 4626 
reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736885 4626 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736893 4626 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736902 4626 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736909 4626 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736918 4626 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736927 4626 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.736935 4626 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-node-log\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.737039 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.737061 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.737080 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.737099 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.737124 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-slash" (OuterVolumeSpecName: "host-slash") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.737145 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.742907 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4eb8735-20e6-4bd1-8965-4a360e39a919-kube-api-access-b5475" (OuterVolumeSpecName: "kube-api-access-b5475") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "kube-api-access-b5475". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.743191 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.750073 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a4eb8735-20e6-4bd1-8965-4a360e39a919" (UID: "a4eb8735-20e6-4bd1-8965-4a360e39a919"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837475 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wbln\" (UniqueName: \"kubernetes.io/projected/8541f24b-018e-42f0-93ad-29e70a36cbf8-kube-api-access-7wbln\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837541 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837562 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-node-log\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837580 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-kubelet\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837639 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovnkube-script-lib\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837665 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovn-node-metrics-cert\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837687 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-run-ovn-kubernetes\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837694 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837714 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-ovn\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 
06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837764 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-ovn\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837799 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-cni-netd\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837812 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-run-ovn-kubernetes\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837831 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovnkube-config\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837842 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-cni-netd\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837881 4626 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-kubelet\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837884 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-node-log\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.837975 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-systemd\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838048 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-etc-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838058 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-run-systemd\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838118 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-systemd-units\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838168 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-etc-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838182 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-run-netns\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838206 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-systemd-units\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838244 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-slash\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838298 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-slash\") pod \"ovnkube-node-x6hqh\" (UID: 
\"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838302 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-cni-bin\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838330 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-cni-bin\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838344 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-var-lib-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838367 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-log-socket\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838368 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-run-netns\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 
06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838387 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838407 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-var-lib-openvswitch\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838413 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-env-overrides\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838444 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838457 4626 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838472 4626 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-b5475\" (UniqueName: \"kubernetes.io/projected/a4eb8735-20e6-4bd1-8965-4a360e39a919-kube-api-access-b5475\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838483 4626 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838574 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8541f24b-018e-42f0-93ad-29e70a36cbf8-log-socket\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838614 4626 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838590 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovnkube-script-lib\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838634 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovnkube-config\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838697 4626 reconciler_common.go:293] "Volume 
detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838723 4626 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838738 4626 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a4eb8735-20e6-4bd1-8965-4a360e39a919-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838754 4626 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-host-slash\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.838766 4626 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a4eb8735-20e6-4bd1-8965-4a360e39a919-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.839034 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8541f24b-018e-42f0-93ad-29e70a36cbf8-env-overrides\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.841611 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8541f24b-018e-42f0-93ad-29e70a36cbf8-ovn-node-metrics-cert\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 
23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.852678 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wbln\" (UniqueName: \"kubernetes.io/projected/8541f24b-018e-42f0-93ad-29e70a36cbf8-kube-api-access-7wbln\") pod \"ovnkube-node-x6hqh\" (UID: \"8541f24b-018e-42f0-93ad-29e70a36cbf8\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:02 crc kubenswrapper[4626]: I0223 06:53:02.945942 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.560305 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovn-acl-logging/0.log" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561177 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhplf_a4eb8735-20e6-4bd1-8965-4a360e39a919/ovn-controller/0.log" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561532 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85" exitCode=0 Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561558 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf" exitCode=0 Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561567 4626 generic.go:334] "Generic (PLEG): container finished" podID="a4eb8735-20e6-4bd1-8965-4a360e39a919" containerID="9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288" exitCode=0 Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561615 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561626 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85"} Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561662 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf"} Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561675 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288"} Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561684 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhplf" event={"ID":"a4eb8735-20e6-4bd1-8965-4a360e39a919","Type":"ContainerDied","Data":"a699ecbb18f2fbbaae7dca1605ee06866be003785ac7fd125e047c3a9b94bfa7"} Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.561703 4626 scope.go:117] "RemoveContainer" containerID="592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.563816 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/2.log" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.565352 4626 generic.go:334] "Generic (PLEG): container finished" podID="8541f24b-018e-42f0-93ad-29e70a36cbf8" containerID="e1cf9b7a96e4b206aa943dd1ddf3d5be3dd56cf0e3f03d1409dd33fc133c3c1b" exitCode=0 Feb 23 
06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.565383 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerDied","Data":"e1cf9b7a96e4b206aa943dd1ddf3d5be3dd56cf0e3f03d1409dd33fc133c3c1b"} Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.565401 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"60d676c4649a316724cf2a63de0a8d88d78ba2ec0a54f71356b37ff3f45588ae"} Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.574559 4626 scope.go:117] "RemoveContainer" containerID="56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.598383 4626 scope.go:117] "RemoveContainer" containerID="d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.628560 4626 scope.go:117] "RemoveContainer" containerID="9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.630043 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lhplf"] Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.638033 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lhplf"] Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.644813 4626 scope.go:117] "RemoveContainer" containerID="c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.659170 4626 scope.go:117] "RemoveContainer" containerID="ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.669726 4626 scope.go:117] "RemoveContainer" 
containerID="695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.680637 4626 scope.go:117] "RemoveContainer" containerID="68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.698636 4626 scope.go:117] "RemoveContainer" containerID="9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.716808 4626 scope.go:117] "RemoveContainer" containerID="592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.717240 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c\": container with ID starting with 592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c not found: ID does not exist" containerID="592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.717294 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c"} err="failed to get container status \"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c\": rpc error: code = NotFound desc = could not find container \"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c\": container with ID starting with 592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.717327 4626 scope.go:117] "RemoveContainer" containerID="56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.717673 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\": container with ID starting with 56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85 not found: ID does not exist" containerID="56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.717712 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85"} err="failed to get container status \"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\": rpc error: code = NotFound desc = could not find container \"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\": container with ID starting with 56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.717751 4626 scope.go:117] "RemoveContainer" containerID="d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.718111 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\": container with ID starting with d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf not found: ID does not exist" containerID="d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.718143 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf"} err="failed to get container status \"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\": rpc error: code = NotFound desc = could not find container 
\"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\": container with ID starting with d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.718173 4626 scope.go:117] "RemoveContainer" containerID="9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.718542 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\": container with ID starting with 9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288 not found: ID does not exist" containerID="9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.718562 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288"} err="failed to get container status \"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\": rpc error: code = NotFound desc = could not find container \"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\": container with ID starting with 9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.718577 4626 scope.go:117] "RemoveContainer" containerID="c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.718894 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\": container with ID starting with c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83 not found: ID does not exist" 
containerID="c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.718913 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83"} err="failed to get container status \"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\": rpc error: code = NotFound desc = could not find container \"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\": container with ID starting with c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.718957 4626 scope.go:117] "RemoveContainer" containerID="ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.719220 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\": container with ID starting with ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655 not found: ID does not exist" containerID="ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719242 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655"} err="failed to get container status \"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\": rpc error: code = NotFound desc = could not find container \"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\": container with ID starting with ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719257 4626 scope.go:117] 
"RemoveContainer" containerID="695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.719486 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\": container with ID starting with 695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a not found: ID does not exist" containerID="695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719525 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a"} err="failed to get container status \"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\": rpc error: code = NotFound desc = could not find container \"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\": container with ID starting with 695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719537 4626 scope.go:117] "RemoveContainer" containerID="68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.719750 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\": container with ID starting with 68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b not found: ID does not exist" containerID="68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719769 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b"} err="failed to get container status \"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\": rpc error: code = NotFound desc = could not find container \"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\": container with ID starting with 68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719781 4626 scope.go:117] "RemoveContainer" containerID="9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5" Feb 23 06:53:03 crc kubenswrapper[4626]: E0223 06:53:03.719958 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\": container with ID starting with 9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5 not found: ID does not exist" containerID="9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719986 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5"} err="failed to get container status \"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\": rpc error: code = NotFound desc = could not find container \"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\": container with ID starting with 9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.719998 4626 scope.go:117] "RemoveContainer" containerID="592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720185 4626 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c"} err="failed to get container status \"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c\": rpc error: code = NotFound desc = could not find container \"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c\": container with ID starting with 592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720213 4626 scope.go:117] "RemoveContainer" containerID="56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720377 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85"} err="failed to get container status \"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\": rpc error: code = NotFound desc = could not find container \"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\": container with ID starting with 56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720396 4626 scope.go:117] "RemoveContainer" containerID="d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720603 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf"} err="failed to get container status \"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\": rpc error: code = NotFound desc = could not find container \"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\": container with ID starting with d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf not 
found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720619 4626 scope.go:117] "RemoveContainer" containerID="9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720806 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288"} err="failed to get container status \"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\": rpc error: code = NotFound desc = could not find container \"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\": container with ID starting with 9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720819 4626 scope.go:117] "RemoveContainer" containerID="c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.720995 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83"} err="failed to get container status \"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\": rpc error: code = NotFound desc = could not find container \"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\": container with ID starting with c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721009 4626 scope.go:117] "RemoveContainer" containerID="ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721192 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655"} err="failed to get 
container status \"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\": rpc error: code = NotFound desc = could not find container \"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\": container with ID starting with ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721207 4626 scope.go:117] "RemoveContainer" containerID="695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721364 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a"} err="failed to get container status \"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\": rpc error: code = NotFound desc = could not find container \"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\": container with ID starting with 695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721382 4626 scope.go:117] "RemoveContainer" containerID="68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721575 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b"} err="failed to get container status \"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\": rpc error: code = NotFound desc = could not find container \"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\": container with ID starting with 68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721591 4626 scope.go:117] "RemoveContainer" 
containerID="9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721761 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5"} err="failed to get container status \"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\": rpc error: code = NotFound desc = could not find container \"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\": container with ID starting with 9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5 not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721780 4626 scope.go:117] "RemoveContainer" containerID="592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721945 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c"} err="failed to get container status \"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c\": rpc error: code = NotFound desc = could not find container \"592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c\": container with ID starting with 592527a81386e20a68f34bbedb65db718c9ed78f6e794c421a6483444b0b890c not found: ID does not exist" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.721960 4626 scope.go:117] "RemoveContainer" containerID="56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85" Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.722117 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85"} err="failed to get container status \"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\": rpc error: code = NotFound desc = could 
not find container \"56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85\": container with ID starting with 56d2e23930a49be7d71f5ca82597f2b82aa42392a640cbddd78ac10b3a260c85 not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.722133 4626 scope.go:117] "RemoveContainer" containerID="d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.722430 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf"} err="failed to get container status \"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\": rpc error: code = NotFound desc = could not find container \"d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf\": container with ID starting with d30de37f6bc9591a3d0fb8739763599828e47cda21c11ff07c0a925f45d800cf not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.722454 4626 scope.go:117] "RemoveContainer" containerID="9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.723694 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288"} err="failed to get container status \"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\": rpc error: code = NotFound desc = could not find container \"9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288\": container with ID starting with 9b1c77fd96dc305fac373a24c1316dab78152df853b4b49859fabd5e23618288 not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.723712 4626 scope.go:117] "RemoveContainer" containerID="c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.723945 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83"} err="failed to get container status \"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\": rpc error: code = NotFound desc = could not find container \"c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83\": container with ID starting with c3a343b900d1b801fba300159f483edb1a30a2f19a67b79df6b1bfc95f067d83 not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.723960 4626 scope.go:117] "RemoveContainer" containerID="ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.724148 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655"} err="failed to get container status \"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\": rpc error: code = NotFound desc = could not find container \"ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655\": container with ID starting with ea19ab657d1dbffd0f7c1dce39df2f46c226a3ae05810e91ba1e0c3cc36fb655 not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.724170 4626 scope.go:117] "RemoveContainer" containerID="695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.724344 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a"} err="failed to get container status \"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\": rpc error: code = NotFound desc = could not find container \"695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a\": container with ID starting with 695babe89af992a7402684eba686b7050224257a6cb149e11a1e10fcbbc06d3a not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.724358 4626 scope.go:117] "RemoveContainer" containerID="68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.724551 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b"} err="failed to get container status \"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\": rpc error: code = NotFound desc = could not find container \"68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b\": container with ID starting with 68526ca93bd2c18f980e2a5c3e2e1f073cb71a0c35185ff3631b7d866bbdc01b not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.724566 4626 scope.go:117] "RemoveContainer" containerID="9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.724720 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5"} err="failed to get container status \"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\": rpc error: code = NotFound desc = could not find container \"9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5\": container with ID starting with 9a0d87283cc1abe7f303ee582067ccd2143f7e3dec456840edeaf1d8a8db1ec5 not found: ID does not exist"
Feb 23 06:53:03 crc kubenswrapper[4626]: I0223 06:53:03.994418 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4eb8735-20e6-4bd1-8965-4a360e39a919" path="/var/lib/kubelet/pods/a4eb8735-20e6-4bd1-8965-4a360e39a919/volumes"
Feb 23 06:53:04 crc kubenswrapper[4626]: I0223 06:53:04.573391 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"4c37f6520c935d5ce79af7e0d80c27482a543bcbb35d416640295e28bff34c9e"}
Feb 23 06:53:04 crc kubenswrapper[4626]: I0223 06:53:04.573695 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"d280480364e8e22bda6b785d2c344df8ae14941765f29b31ac20be4db75b4d84"}
Feb 23 06:53:04 crc kubenswrapper[4626]: I0223 06:53:04.573712 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"f02bf2899c531422dbf60381894f633a512f842b4d0919c6f8d6714688c72137"}
Feb 23 06:53:04 crc kubenswrapper[4626]: I0223 06:53:04.573721 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"57641a0e24e896a24e9ece8adc65722a9662e51e92d0dbdaf70072719215893b"}
Feb 23 06:53:04 crc kubenswrapper[4626]: I0223 06:53:04.573731 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"ec184fcd30deafcf657d8b43121decfc2ce76dbba7ee5f74cd64ac0e122d1ebc"}
Feb 23 06:53:04 crc kubenswrapper[4626]: I0223 06:53:04.573742 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"346d02b3a85e01ab945dca162748d15da20c7ada5d1b1d41c895878e7ba3db0d"}
Feb 23 06:53:06 crc kubenswrapper[4626]: I0223 06:53:06.590477 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"37fc5eb7aae159a9ecc6059dc5c8171e43f9ba36ffd008878fc79c6077161a1f"}
Feb 23 06:53:08 crc kubenswrapper[4626]: I0223 06:53:08.617367 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" event={"ID":"8541f24b-018e-42f0-93ad-29e70a36cbf8","Type":"ContainerStarted","Data":"fb67f75b817af188544cba1a602d0de14aae462510d3b0d09efea4c067c503bd"}
Feb 23 06:53:08 crc kubenswrapper[4626]: I0223 06:53:08.617869 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh"
Feb 23 06:53:08 crc kubenswrapper[4626]: I0223 06:53:08.617966 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh"
Feb 23 06:53:08 crc kubenswrapper[4626]: I0223 06:53:08.618031 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh"
Feb 23 06:53:08 crc kubenswrapper[4626]: I0223 06:53:08.649764 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh"
Feb 23 06:53:08 crc kubenswrapper[4626]: I0223 06:53:08.657747 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh" podStartSLOduration=6.657732936 podStartE2EDuration="6.657732936s" podCreationTimestamp="2026-02-23 06:53:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:53:08.650943675 +0000 UTC m=+740.990272942" watchObservedRunningTime="2026-02-23 06:53:08.657732936 +0000 UTC m=+740.997062202"
Feb 23 06:53:08 crc kubenswrapper[4626]: I0223 06:53:08.669285 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh"
Feb 23 06:53:13 crc kubenswrapper[4626]: I0223 06:53:13.982399 4626 scope.go:117] "RemoveContainer" containerID="9a25087115100c9626d3a1eafde3dd594af1266341b73b36a08abdb447c9395e"
Feb 23 06:53:13 crc kubenswrapper[4626]: E0223 06:53:13.983305 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-lbzx5_openshift-multus(27fe907f-67db-4a19-a485-22debfb92983)\"" pod="openshift-multus/multus-lbzx5" podUID="27fe907f-67db-4a19-a485-22debfb92983"
Feb 23 06:53:25 crc kubenswrapper[4626]: I0223 06:53:25.685845 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 06:53:25 crc kubenswrapper[4626]: I0223 06:53:25.686473 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 06:53:26 crc kubenswrapper[4626]: I0223 06:53:26.981807 4626 scope.go:117] "RemoveContainer" containerID="9a25087115100c9626d3a1eafde3dd594af1266341b73b36a08abdb447c9395e"
Feb 23 06:53:27 crc kubenswrapper[4626]: I0223 06:53:27.721820 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lbzx5_27fe907f-67db-4a19-a485-22debfb92983/kube-multus/2.log"
Feb 23 06:53:27 crc kubenswrapper[4626]: I0223 06:53:27.722398 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lbzx5" event={"ID":"27fe907f-67db-4a19-a485-22debfb92983","Type":"ContainerStarted","Data":"c67b97f073ea2c9ade5e83d8bcc12d4a117c514982cca778b37a4c2a5867352b"}
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.019569 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"]
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.021288 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.023707 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.035173 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"]
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.087720 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.087887 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvhjh\" (UniqueName: \"kubernetes.io/projected/d5f558ac-cd5c-4de8-853a-e859cdb09641-kube-api-access-tvhjh\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.087955 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.188746 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.188840 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvhjh\" (UniqueName: \"kubernetes.io/projected/d5f558ac-cd5c-4de8-853a-e859cdb09641-kube-api-access-tvhjh\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.188875 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.189228 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.189299 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.208015 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvhjh\" (UniqueName: \"kubernetes.io/projected/d5f558ac-cd5c-4de8-853a-e859cdb09641-kube-api-access-tvhjh\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.339870 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.724566 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"]
Feb 23 06:53:32 crc kubenswrapper[4626]: W0223 06:53:32.731574 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f558ac_cd5c_4de8_853a_e859cdb09641.slice/crio-7507c5b8851259d21be04a290479a6f9e1c17a98407e1974776d2290f0a0177c WatchSource:0}: Error finding container 7507c5b8851259d21be04a290479a6f9e1c17a98407e1974776d2290f0a0177c: Status 404 returned error can't find the container with id 7507c5b8851259d21be04a290479a6f9e1c17a98407e1974776d2290f0a0177c
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.748921 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh" event={"ID":"d5f558ac-cd5c-4de8-853a-e859cdb09641","Type":"ContainerStarted","Data":"7507c5b8851259d21be04a290479a6f9e1c17a98407e1974776d2290f0a0177c"}
Feb 23 06:53:32 crc kubenswrapper[4626]: I0223 06:53:32.971395 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6hqh"
Feb 23 06:53:33 crc kubenswrapper[4626]: I0223 06:53:33.756229 4626 generic.go:334] "Generic (PLEG): container finished" podID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerID="e6019c030fceda9a6da0503402867137bbe90888ce73171226f3065e65ef94a0" exitCode=0
Feb 23 06:53:33 crc kubenswrapper[4626]: I0223 06:53:33.756310 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh" event={"ID":"d5f558ac-cd5c-4de8-853a-e859cdb09641","Type":"ContainerDied","Data":"e6019c030fceda9a6da0503402867137bbe90888ce73171226f3065e65ef94a0"}
Feb 23 06:53:35 crc kubenswrapper[4626]: I0223 06:53:35.768167 4626 generic.go:334] "Generic (PLEG): container finished" podID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerID="859723d385fa90af473a1571e189593b38d383754f6814671f1626f6179823a9" exitCode=0
Feb 23 06:53:35 crc kubenswrapper[4626]: I0223 06:53:35.768296 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh" event={"ID":"d5f558ac-cd5c-4de8-853a-e859cdb09641","Type":"ContainerDied","Data":"859723d385fa90af473a1571e189593b38d383754f6814671f1626f6179823a9"}
Feb 23 06:53:36 crc kubenswrapper[4626]: I0223 06:53:36.776613 4626 generic.go:334] "Generic (PLEG): container finished" podID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerID="060b4992b8d0c95e36d39f59adfe64105a209a89ccbe7de284779434ab4175e7" exitCode=0
Feb 23 06:53:36 crc kubenswrapper[4626]: I0223 06:53:36.776663 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh" event={"ID":"d5f558ac-cd5c-4de8-853a-e859cdb09641","Type":"ContainerDied","Data":"060b4992b8d0c95e36d39f59adfe64105a209a89ccbe7de284779434ab4175e7"}
Feb 23 06:53:37 crc kubenswrapper[4626]: I0223 06:53:37.961409 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.161175 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-util\") pod \"d5f558ac-cd5c-4de8-853a-e859cdb09641\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") "
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.161241 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-bundle\") pod \"d5f558ac-cd5c-4de8-853a-e859cdb09641\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") "
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.161311 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvhjh\" (UniqueName: \"kubernetes.io/projected/d5f558ac-cd5c-4de8-853a-e859cdb09641-kube-api-access-tvhjh\") pod \"d5f558ac-cd5c-4de8-853a-e859cdb09641\" (UID: \"d5f558ac-cd5c-4de8-853a-e859cdb09641\") "
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.162022 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-bundle" (OuterVolumeSpecName: "bundle") pod "d5f558ac-cd5c-4de8-853a-e859cdb09641" (UID: "d5f558ac-cd5c-4de8-853a-e859cdb09641"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.167264 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f558ac-cd5c-4de8-853a-e859cdb09641-kube-api-access-tvhjh" (OuterVolumeSpecName: "kube-api-access-tvhjh") pod "d5f558ac-cd5c-4de8-853a-e859cdb09641" (UID: "d5f558ac-cd5c-4de8-853a-e859cdb09641"). InnerVolumeSpecName "kube-api-access-tvhjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.263139 4626 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.263175 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvhjh\" (UniqueName: \"kubernetes.io/projected/d5f558ac-cd5c-4de8-853a-e859cdb09641-kube-api-access-tvhjh\") on node \"crc\" DevicePath \"\""
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.384784 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-util" (OuterVolumeSpecName: "util") pod "d5f558ac-cd5c-4de8-853a-e859cdb09641" (UID: "d5f558ac-cd5c-4de8-853a-e859cdb09641"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.466454 4626 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d5f558ac-cd5c-4de8-853a-e859cdb09641-util\") on node \"crc\" DevicePath \"\""
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.799366 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh" event={"ID":"d5f558ac-cd5c-4de8-853a-e859cdb09641","Type":"ContainerDied","Data":"7507c5b8851259d21be04a290479a6f9e1c17a98407e1974776d2290f0a0177c"}
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.799437 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7507c5b8851259d21be04a290479a6f9e1c17a98407e1974776d2290f0a0177c"
Feb 23 06:53:38 crc kubenswrapper[4626]: I0223 06:53:38.799456 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.849612 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"]
Feb 23 06:53:39 crc kubenswrapper[4626]: E0223 06:53:39.850893 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerName="pull"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.850983 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerName="pull"
Feb 23 06:53:39 crc kubenswrapper[4626]: E0223 06:53:39.851035 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerName="util"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.851082 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerName="util"
Feb 23 06:53:39 crc kubenswrapper[4626]: E0223 06:53:39.851129 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerName="extract"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.851172 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerName="extract"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.851334 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f558ac-cd5c-4de8-853a-e859cdb09641" containerName="extract"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.851816 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.855547 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.855629 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.855560 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-t9c4l"
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.864869 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"]
Feb 23 06:53:39 crc kubenswrapper[4626]: I0223 06:53:39.984697 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsnm\" (UniqueName: \"kubernetes.io/projected/33276922-ab9c-4bb0-ad0a-71ca54766841-kube-api-access-kgsnm\") pod \"nmstate-operator-694c9596b7-7j8mq\" (UID: \"33276922-ab9c-4bb0-ad0a-71ca54766841\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"
Feb 23 06:53:40 crc kubenswrapper[4626]: I0223 06:53:40.085942 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsnm\" (UniqueName: \"kubernetes.io/projected/33276922-ab9c-4bb0-ad0a-71ca54766841-kube-api-access-kgsnm\") pod \"nmstate-operator-694c9596b7-7j8mq\" (UID: \"33276922-ab9c-4bb0-ad0a-71ca54766841\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"
Feb 23 06:53:40 crc kubenswrapper[4626]: I0223 06:53:40.107938 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsnm\" (UniqueName: \"kubernetes.io/projected/33276922-ab9c-4bb0-ad0a-71ca54766841-kube-api-access-kgsnm\") pod \"nmstate-operator-694c9596b7-7j8mq\" (UID: \"33276922-ab9c-4bb0-ad0a-71ca54766841\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"
Feb 23 06:53:40 crc kubenswrapper[4626]: I0223 06:53:40.164006 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"
Feb 23 06:53:40 crc kubenswrapper[4626]: I0223 06:53:40.356023 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-7j8mq"]
Feb 23 06:53:40 crc kubenswrapper[4626]: W0223 06:53:40.362158 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33276922_ab9c_4bb0_ad0a_71ca54766841.slice/crio-ce3a1ab015a415c718ecefb5fbb446d20c3bf368b4886e275532e72a82786aac WatchSource:0}: Error finding container ce3a1ab015a415c718ecefb5fbb446d20c3bf368b4886e275532e72a82786aac: Status 404 returned error can't find the container with id ce3a1ab015a415c718ecefb5fbb446d20c3bf368b4886e275532e72a82786aac
Feb 23 06:53:40 crc kubenswrapper[4626]: I0223 06:53:40.810728 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq" event={"ID":"33276922-ab9c-4bb0-ad0a-71ca54766841","Type":"ContainerStarted","Data":"ce3a1ab015a415c718ecefb5fbb446d20c3bf368b4886e275532e72a82786aac"}
Feb 23 06:53:42 crc kubenswrapper[4626]: I0223 06:53:42.826665 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq" event={"ID":"33276922-ab9c-4bb0-ad0a-71ca54766841","Type":"ContainerStarted","Data":"125f8b2000cce6e8b8bcb94ce2dfcf17b2b6c216457454b326eaf50bb1376faf"}
Feb 23 06:53:42 crc kubenswrapper[4626]: I0223 06:53:42.847066 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-7j8mq" podStartSLOduration=1.90499105 podStartE2EDuration="3.847041433s" podCreationTimestamp="2026-02-23 06:53:39 +0000 UTC" firstStartedPulling="2026-02-23 06:53:40.365145642 +0000 UTC m=+772.704474908" lastFinishedPulling="2026-02-23 06:53:42.307196025 +0000 UTC m=+774.646525291" observedRunningTime="2026-02-23 06:53:42.842797071 +0000 UTC m=+775.182126327" watchObservedRunningTime="2026-02-23 06:53:42.847041433 +0000 UTC m=+775.186370699"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.646188 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp"]
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.647267 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.650994 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xntkf"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.658720 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp"]
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.675838 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b"]
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.676526 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.686000 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.698910 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6fccn"]
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.699714 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.712940 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b"]
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.739187 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmv5d\" (UniqueName: \"kubernetes.io/projected/fbb43c1e-8e84-47ab-8bca-d2b1fc06efce-kube-api-access-zmv5d\") pod \"nmstate-webhook-866bcb46dc-pvm5b\" (UID: \"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.739381 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-dbus-socket\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.739464 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-nmstate-lock\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.739574 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-ovs-socket\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.739664 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbmb\" (UniqueName: \"kubernetes.io/projected/0b5fd810-1004-455d-ac3c-b7d5fc387861-kube-api-access-zhbmb\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.739769 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fbb43c1e-8e84-47ab-8bca-d2b1fc06efce-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pvm5b\" (UID: \"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.739847 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snvd\" (UniqueName: \"kubernetes.io/projected/89312997-cb41-4e39-9fb3-bb07a7b5d7b6-kube-api-access-7snvd\") pod \"nmstate-metrics-58c85c668d-5dgzp\" (UID: \"89312997-cb41-4e39-9fb3-bb07a7b5d7b6\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.811822 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r"]
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.812988 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.820442 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-drj85"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.820484 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.820833 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.841467 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-dbus-socket\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.841913 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-dbus-socket\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842096 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-nmstate-lock\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842174 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-nmstate-lock\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842248 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842306 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-ovs-socket\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842382 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbmb\" (UniqueName: \"kubernetes.io/projected/0b5fd810-1004-455d-ac3c-b7d5fc387861-kube-api-access-zhbmb\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842423 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0b5fd810-1004-455d-ac3c-b7d5fc387861-ovs-socket\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn"
Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842451 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6c9\" (UniqueName: \"kubernetes.io/projected/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-kube-api-access-cw6c9\") pod
\"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842531 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/fbb43c1e-8e84-47ab-8bca-d2b1fc06efce-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pvm5b\" (UID: \"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.842617 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snvd\" (UniqueName: \"kubernetes.io/projected/89312997-cb41-4e39-9fb3-bb07a7b5d7b6-kube-api-access-7snvd\") pod \"nmstate-metrics-58c85c668d-5dgzp\" (UID: \"89312997-cb41-4e39-9fb3-bb07a7b5d7b6\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.843850 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmv5d\" (UniqueName: \"kubernetes.io/projected/fbb43c1e-8e84-47ab-8bca-d2b1fc06efce-kube-api-access-zmv5d\") pod \"nmstate-webhook-866bcb46dc-pvm5b\" (UID: \"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.844028 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.851476 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/fbb43c1e-8e84-47ab-8bca-d2b1fc06efce-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pvm5b\" (UID: \"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.865128 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmv5d\" (UniqueName: \"kubernetes.io/projected/fbb43c1e-8e84-47ab-8bca-d2b1fc06efce-kube-api-access-zmv5d\") pod \"nmstate-webhook-866bcb46dc-pvm5b\" (UID: \"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.867161 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snvd\" (UniqueName: \"kubernetes.io/projected/89312997-cb41-4e39-9fb3-bb07a7b5d7b6-kube-api-access-7snvd\") pod \"nmstate-metrics-58c85c668d-5dgzp\" (UID: \"89312997-cb41-4e39-9fb3-bb07a7b5d7b6\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.874514 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r"] Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.879251 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbmb\" (UniqueName: \"kubernetes.io/projected/0b5fd810-1004-455d-ac3c-b7d5fc387861-kube-api-access-zhbmb\") pod \"nmstate-handler-6fccn\" (UID: \"0b5fd810-1004-455d-ac3c-b7d5fc387861\") " pod="openshift-nmstate/nmstate-handler-6fccn" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.945874 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6c9\" (UniqueName: \"kubernetes.io/projected/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-kube-api-access-cw6c9\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.946103 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.946198 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.947400 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.950240 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.960658 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.963435 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6c9\" (UniqueName: \"kubernetes.io/projected/ba7e56b0-e6c6-434c-9f46-c4526c1448f7-kube-api-access-cw6c9\") pod \"nmstate-console-plugin-5c78fc5d65-s4n9r\" (UID: \"ba7e56b0-e6c6-434c-9f46-c4526c1448f7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:43 crc kubenswrapper[4626]: I0223 06:53:43.996408 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.016699 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6fccn" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.019672 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cd94474c7-m8ch2"] Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.020429 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.047608 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19c336f3-a562-4204-b2a2-b089ea0682ad-console-serving-cert\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.047747 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-service-ca\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.047854 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-trusted-ca-bundle\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.047977 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9qwt\" (UniqueName: \"kubernetes.io/projected/19c336f3-a562-4204-b2a2-b089ea0682ad-kube-api-access-p9qwt\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.048084 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-console-config\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.048168 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-oauth-serving-cert\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.048263 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c336f3-a562-4204-b2a2-b089ea0682ad-console-oauth-config\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.095662 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cd94474c7-m8ch2"] Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.131179 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.149337 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c336f3-a562-4204-b2a2-b089ea0682ad-console-oauth-config\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.149433 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19c336f3-a562-4204-b2a2-b089ea0682ad-console-serving-cert\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.149469 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-service-ca\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.149586 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-trusted-ca-bundle\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.149649 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9qwt\" (UniqueName: \"kubernetes.io/projected/19c336f3-a562-4204-b2a2-b089ea0682ad-kube-api-access-p9qwt\") pod \"console-5cd94474c7-m8ch2\" (UID: 
\"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.149698 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-console-config\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.149720 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-oauth-serving-cert\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.150490 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-oauth-serving-cert\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.151577 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-console-config\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.151663 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-trusted-ca-bundle\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " 
pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.152026 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c336f3-a562-4204-b2a2-b089ea0682ad-service-ca\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.157603 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19c336f3-a562-4204-b2a2-b089ea0682ad-console-serving-cert\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.163903 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c336f3-a562-4204-b2a2-b089ea0682ad-console-oauth-config\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.164982 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9qwt\" (UniqueName: \"kubernetes.io/projected/19c336f3-a562-4204-b2a2-b089ea0682ad-kube-api-access-p9qwt\") pod \"console-5cd94474c7-m8ch2\" (UID: \"19c336f3-a562-4204-b2a2-b089ea0682ad\") " pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.253872 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b"] Feb 23 06:53:44 crc kubenswrapper[4626]: W0223 06:53:44.261865 4626 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb43c1e_8e84_47ab_8bca_d2b1fc06efce.slice/crio-70d32754f8d4a3bce0061236f18258c1d2ddf37f3bad0a8c3393220b1550de4e WatchSource:0}: Error finding container 70d32754f8d4a3bce0061236f18258c1d2ddf37f3bad0a8c3393220b1550de4e: Status 404 returned error can't find the container with id 70d32754f8d4a3bce0061236f18258c1d2ddf37f3bad0a8c3393220b1550de4e Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.339554 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.433860 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp"] Feb 23 06:53:44 crc kubenswrapper[4626]: W0223 06:53:44.446300 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89312997_cb41_4e39_9fb3_bb07a7b5d7b6.slice/crio-b2eb820761676e892a4176172d2864dd1cdaac18d654ff16add1042f7153e852 WatchSource:0}: Error finding container b2eb820761676e892a4176172d2864dd1cdaac18d654ff16add1042f7153e852: Status 404 returned error can't find the container with id b2eb820761676e892a4176172d2864dd1cdaac18d654ff16add1042f7153e852 Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.527049 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r"] Feb 23 06:53:44 crc kubenswrapper[4626]: W0223 06:53:44.533022 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7e56b0_e6c6_434c_9f46_c4526c1448f7.slice/crio-da8ecf5505da590f5852a59dfa830b9b29c047f4ab4c911d267573b0984f2ff6 WatchSource:0}: Error finding container da8ecf5505da590f5852a59dfa830b9b29c047f4ab4c911d267573b0984f2ff6: Status 404 returned error can't find the container with id 
da8ecf5505da590f5852a59dfa830b9b29c047f4ab4c911d267573b0984f2ff6 Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.540907 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cd94474c7-m8ch2"] Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.838576 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp" event={"ID":"89312997-cb41-4e39-9fb3-bb07a7b5d7b6","Type":"ContainerStarted","Data":"b2eb820761676e892a4176172d2864dd1cdaac18d654ff16add1042f7153e852"} Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.839679 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" event={"ID":"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce","Type":"ContainerStarted","Data":"70d32754f8d4a3bce0061236f18258c1d2ddf37f3bad0a8c3393220b1550de4e"} Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.840617 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" event={"ID":"ba7e56b0-e6c6-434c-9f46-c4526c1448f7","Type":"ContainerStarted","Data":"da8ecf5505da590f5852a59dfa830b9b29c047f4ab4c911d267573b0984f2ff6"} Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.841761 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cd94474c7-m8ch2" event={"ID":"19c336f3-a562-4204-b2a2-b089ea0682ad","Type":"ContainerStarted","Data":"34089aca3d11141c73c2758c7d97fcd06abbf02759e86c49c7ba87b11c21fd8b"} Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.841810 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cd94474c7-m8ch2" event={"ID":"19c336f3-a562-4204-b2a2-b089ea0682ad","Type":"ContainerStarted","Data":"b12b5631b79ada939e3b312bd77efd8340614efe55212a26df82dd3d2f3c1d1e"} Feb 23 06:53:44 crc kubenswrapper[4626]: I0223 06:53:44.842453 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-6fccn" event={"ID":"0b5fd810-1004-455d-ac3c-b7d5fc387861","Type":"ContainerStarted","Data":"54f2e218ddb86ef64d0f7fbadae48c5a77026fba4548818986cd1f71f94ff436"} Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.871382 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" event={"ID":"fbb43c1e-8e84-47ab-8bca-d2b1fc06efce","Type":"ContainerStarted","Data":"7f9bf5439b6af3426dfe5edc30700be3edcac926dd33723eb726a7e3805c8ba8"} Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.872085 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.873389 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" event={"ID":"ba7e56b0-e6c6-434c-9f46-c4526c1448f7","Type":"ContainerStarted","Data":"c1b5295c993fbcd20849f79fbec077a99ecea2087a3809e4709a17359aa0de7d"} Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.875575 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6fccn" event={"ID":"0b5fd810-1004-455d-ac3c-b7d5fc387861","Type":"ContainerStarted","Data":"d3ae89de071887b0634d0d8e2587fd8480ec0fb4a8fe98d685109de6794b9fd9"} Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.875722 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6fccn" Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.876955 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp" event={"ID":"89312997-cb41-4e39-9fb3-bb07a7b5d7b6","Type":"ContainerStarted","Data":"2b9bab850a946c989bd63e7b600169bbc0b7922dea2a636be5085020604ea1a1"} Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.889408 4626 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/console-5cd94474c7-m8ch2" podStartSLOduration=3.889384208 podStartE2EDuration="3.889384208s" podCreationTimestamp="2026-02-23 06:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:53:44.862957634 +0000 UTC m=+777.202286901" watchObservedRunningTime="2026-02-23 06:53:47.889384208 +0000 UTC m=+780.228713473" Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.891922 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" podStartSLOduration=2.116069228 podStartE2EDuration="4.891909659s" podCreationTimestamp="2026-02-23 06:53:43 +0000 UTC" firstStartedPulling="2026-02-23 06:53:44.264605927 +0000 UTC m=+776.603935193" lastFinishedPulling="2026-02-23 06:53:47.040446358 +0000 UTC m=+779.379775624" observedRunningTime="2026-02-23 06:53:47.886387787 +0000 UTC m=+780.225717054" watchObservedRunningTime="2026-02-23 06:53:47.891909659 +0000 UTC m=+780.231238925" Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.904698 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6fccn" podStartSLOduration=1.924324578 podStartE2EDuration="4.904683704s" podCreationTimestamp="2026-02-23 06:53:43 +0000 UTC" firstStartedPulling="2026-02-23 06:53:44.062273925 +0000 UTC m=+776.401603191" lastFinishedPulling="2026-02-23 06:53:47.042633051 +0000 UTC m=+779.381962317" observedRunningTime="2026-02-23 06:53:47.901242075 +0000 UTC m=+780.240571341" watchObservedRunningTime="2026-02-23 06:53:47.904683704 +0000 UTC m=+780.244012970" Feb 23 06:53:47 crc kubenswrapper[4626]: I0223 06:53:47.921449 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-s4n9r" podStartSLOduration=2.425782609 podStartE2EDuration="4.921427051s" 
podCreationTimestamp="2026-02-23 06:53:43 +0000 UTC" firstStartedPulling="2026-02-23 06:53:44.537997096 +0000 UTC m=+776.877326351" lastFinishedPulling="2026-02-23 06:53:47.033641538 +0000 UTC m=+779.372970793" observedRunningTime="2026-02-23 06:53:47.912425901 +0000 UTC m=+780.251755167" watchObservedRunningTime="2026-02-23 06:53:47.921427051 +0000 UTC m=+780.260756318" Feb 23 06:53:49 crc kubenswrapper[4626]: I0223 06:53:49.894699 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp" event={"ID":"89312997-cb41-4e39-9fb3-bb07a7b5d7b6","Type":"ContainerStarted","Data":"5770df9e6489101511232eb6a502a45aafdb63d6fd6db89f302aba6f2ae9aba0"} Feb 23 06:53:49 crc kubenswrapper[4626]: I0223 06:53:49.909951 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-5dgzp" podStartSLOduration=2.112464561 podStartE2EDuration="6.909927873s" podCreationTimestamp="2026-02-23 06:53:43 +0000 UTC" firstStartedPulling="2026-02-23 06:53:44.450747903 +0000 UTC m=+776.790077159" lastFinishedPulling="2026-02-23 06:53:49.248211215 +0000 UTC m=+781.587540471" observedRunningTime="2026-02-23 06:53:49.908331493 +0000 UTC m=+782.247660769" watchObservedRunningTime="2026-02-23 06:53:49.909927873 +0000 UTC m=+782.249257139" Feb 23 06:53:54 crc kubenswrapper[4626]: I0223 06:53:54.035753 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6fccn" Feb 23 06:53:54 crc kubenswrapper[4626]: I0223 06:53:54.340580 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:54 crc kubenswrapper[4626]: I0223 06:53:54.340627 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:54 crc kubenswrapper[4626]: I0223 06:53:54.346185 4626 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:54 crc kubenswrapper[4626]: I0223 06:53:54.928796 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cd94474c7-m8ch2" Feb 23 06:53:54 crc kubenswrapper[4626]: I0223 06:53:54.981179 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ktkzv"] Feb 23 06:53:55 crc kubenswrapper[4626]: I0223 06:53:55.685746 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:53:55 crc kubenswrapper[4626]: I0223 06:53:55.687130 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:54:04 crc kubenswrapper[4626]: I0223 06:54:04.002538 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pvm5b" Feb 23 06:54:14 crc kubenswrapper[4626]: I0223 06:54:14.879975 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv"] Feb 23 06:54:14 crc kubenswrapper[4626]: I0223 06:54:14.881914 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:14 crc kubenswrapper[4626]: I0223 06:54:14.884122 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 06:54:14 crc kubenswrapper[4626]: I0223 06:54:14.897058 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv"] Feb 23 06:54:14 crc kubenswrapper[4626]: I0223 06:54:14.955188 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfr9z\" (UniqueName: \"kubernetes.io/projected/24e5b70d-095c-49f0-93d4-89ba91e35a57-kube-api-access-pfr9z\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:14 crc kubenswrapper[4626]: I0223 06:54:14.955281 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:14 crc kubenswrapper[4626]: I0223 06:54:14.955351 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: 
I0223 06:54:15.056485 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfr9z\" (UniqueName: \"kubernetes.io/projected/24e5b70d-095c-49f0-93d4-89ba91e35a57-kube-api-access-pfr9z\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: I0223 06:54:15.056571 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: I0223 06:54:15.056623 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: I0223 06:54:15.057090 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: I0223 06:54:15.057849 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: I0223 06:54:15.075266 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfr9z\" (UniqueName: \"kubernetes.io/projected/24e5b70d-095c-49f0-93d4-89ba91e35a57-kube-api-access-pfr9z\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: I0223 06:54:15.202515 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:15 crc kubenswrapper[4626]: I0223 06:54:15.392823 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv"] Feb 23 06:54:16 crc kubenswrapper[4626]: I0223 06:54:16.063020 4626 generic.go:334] "Generic (PLEG): container finished" podID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerID="8fe34aa4a2c106bcb2932762db83797e86f15cca4b32ecf419b0f5b7fb6b45f8" exitCode=0 Feb 23 06:54:16 crc kubenswrapper[4626]: I0223 06:54:16.063119 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" event={"ID":"24e5b70d-095c-49f0-93d4-89ba91e35a57","Type":"ContainerDied","Data":"8fe34aa4a2c106bcb2932762db83797e86f15cca4b32ecf419b0f5b7fb6b45f8"} Feb 23 06:54:16 crc kubenswrapper[4626]: I0223 06:54:16.063407 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" event={"ID":"24e5b70d-095c-49f0-93d4-89ba91e35a57","Type":"ContainerStarted","Data":"093675f0919d19f36dfc66747946911767580f707733b7069d3d6262991b2777"} Feb 23 06:54:18 crc kubenswrapper[4626]: I0223 06:54:18.081055 4626 generic.go:334] "Generic (PLEG): container finished" podID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerID="5cb0c784b41d97367205384095b72c92369208f753cdbeb16d70d1635d1f03e5" exitCode=0 Feb 23 06:54:18 crc kubenswrapper[4626]: I0223 06:54:18.081153 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" event={"ID":"24e5b70d-095c-49f0-93d4-89ba91e35a57","Type":"ContainerDied","Data":"5cb0c784b41d97367205384095b72c92369208f753cdbeb16d70d1635d1f03e5"} Feb 23 06:54:19 crc kubenswrapper[4626]: I0223 06:54:19.091971 4626 generic.go:334] "Generic (PLEG): container finished" podID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerID="59f2f341a9f7b7a53880db86bc0d3d76892b47e5f5d260985708d1f376bc3aaf" exitCode=0 Feb 23 06:54:19 crc kubenswrapper[4626]: I0223 06:54:19.092076 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" event={"ID":"24e5b70d-095c-49f0-93d4-89ba91e35a57","Type":"ContainerDied","Data":"59f2f341a9f7b7a53880db86bc0d3d76892b47e5f5d260985708d1f376bc3aaf"} Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.013916 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ktkzv" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerName="console" containerID="cri-o://8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be" gracePeriod=15 Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.305995 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.356057 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ktkzv_4abfb5ed-4161-41d1-9cb5-70a93c60e109/console/0.log" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.356139 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.427849 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-oauth-config\") pod \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.427974 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-util\") pod \"24e5b70d-095c-49f0-93d4-89ba91e35a57\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428036 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-bundle\") pod \"24e5b70d-095c-49f0-93d4-89ba91e35a57\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428080 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-trusted-ca-bundle\") pod \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428150 4626 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-config\") pod \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428169 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-service-ca\") pod \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428188 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfr9z\" (UniqueName: \"kubernetes.io/projected/24e5b70d-095c-49f0-93d4-89ba91e35a57-kube-api-access-pfr9z\") pod \"24e5b70d-095c-49f0-93d4-89ba91e35a57\" (UID: \"24e5b70d-095c-49f0-93d4-89ba91e35a57\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428221 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-serving-cert\") pod \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428240 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fgp6\" (UniqueName: \"kubernetes.io/projected/4abfb5ed-4161-41d1-9cb5-70a93c60e109-kube-api-access-4fgp6\") pod \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.428265 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-oauth-serving-cert\") 
pod \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\" (UID: \"4abfb5ed-4161-41d1-9cb5-70a93c60e109\") " Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.429014 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4abfb5ed-4161-41d1-9cb5-70a93c60e109" (UID: "4abfb5ed-4161-41d1-9cb5-70a93c60e109"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.429036 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4abfb5ed-4161-41d1-9cb5-70a93c60e109" (UID: "4abfb5ed-4161-41d1-9cb5-70a93c60e109"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.429538 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-service-ca" (OuterVolumeSpecName: "service-ca") pod "4abfb5ed-4161-41d1-9cb5-70a93c60e109" (UID: "4abfb5ed-4161-41d1-9cb5-70a93c60e109"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.429667 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-config" (OuterVolumeSpecName: "console-config") pod "4abfb5ed-4161-41d1-9cb5-70a93c60e109" (UID: "4abfb5ed-4161-41d1-9cb5-70a93c60e109"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.429984 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-bundle" (OuterVolumeSpecName: "bundle") pod "24e5b70d-095c-49f0-93d4-89ba91e35a57" (UID: "24e5b70d-095c-49f0-93d4-89ba91e35a57"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.435738 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e5b70d-095c-49f0-93d4-89ba91e35a57-kube-api-access-pfr9z" (OuterVolumeSpecName: "kube-api-access-pfr9z") pod "24e5b70d-095c-49f0-93d4-89ba91e35a57" (UID: "24e5b70d-095c-49f0-93d4-89ba91e35a57"). InnerVolumeSpecName "kube-api-access-pfr9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.436431 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4abfb5ed-4161-41d1-9cb5-70a93c60e109" (UID: "4abfb5ed-4161-41d1-9cb5-70a93c60e109"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.436476 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abfb5ed-4161-41d1-9cb5-70a93c60e109-kube-api-access-4fgp6" (OuterVolumeSpecName: "kube-api-access-4fgp6") pod "4abfb5ed-4161-41d1-9cb5-70a93c60e109" (UID: "4abfb5ed-4161-41d1-9cb5-70a93c60e109"). InnerVolumeSpecName "kube-api-access-4fgp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.437632 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4abfb5ed-4161-41d1-9cb5-70a93c60e109" (UID: "4abfb5ed-4161-41d1-9cb5-70a93c60e109"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.439267 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-util" (OuterVolumeSpecName: "util") pod "24e5b70d-095c-49f0-93d4-89ba91e35a57" (UID: "24e5b70d-095c-49f0-93d4-89ba91e35a57"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530323 4626 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530363 4626 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530380 4626 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530390 4626 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc 
kubenswrapper[4626]: I0223 06:54:20.530402 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfr9z\" (UniqueName: \"kubernetes.io/projected/24e5b70d-095c-49f0-93d4-89ba91e35a57-kube-api-access-pfr9z\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530414 4626 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530426 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fgp6\" (UniqueName: \"kubernetes.io/projected/4abfb5ed-4161-41d1-9cb5-70a93c60e109-kube-api-access-4fgp6\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530439 4626 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4abfb5ed-4161-41d1-9cb5-70a93c60e109-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530448 4626 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4abfb5ed-4161-41d1-9cb5-70a93c60e109-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:20 crc kubenswrapper[4626]: I0223 06:54:20.530459 4626 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24e5b70d-095c-49f0-93d4-89ba91e35a57-util\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.109900 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.109916 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv" event={"ID":"24e5b70d-095c-49f0-93d4-89ba91e35a57","Type":"ContainerDied","Data":"093675f0919d19f36dfc66747946911767580f707733b7069d3d6262991b2777"} Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.109966 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="093675f0919d19f36dfc66747946911767580f707733b7069d3d6262991b2777" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.111413 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ktkzv_4abfb5ed-4161-41d1-9cb5-70a93c60e109/console/0.log" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.111474 4626 generic.go:334] "Generic (PLEG): container finished" podID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerID="8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be" exitCode=2 Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.111530 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ktkzv" event={"ID":"4abfb5ed-4161-41d1-9cb5-70a93c60e109","Type":"ContainerDied","Data":"8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be"} Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.111567 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ktkzv" event={"ID":"4abfb5ed-4161-41d1-9cb5-70a93c60e109","Type":"ContainerDied","Data":"0cf44727097895470c05496903ad94e3192a1acac67c4f089be04dad2cbafda6"} Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.111590 4626 scope.go:117] "RemoveContainer" containerID="8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be" Feb 23 06:54:21 crc 
kubenswrapper[4626]: I0223 06:54:21.111735 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ktkzv" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.148355 4626 scope.go:117] "RemoveContainer" containerID="8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be" Feb 23 06:54:21 crc kubenswrapper[4626]: E0223 06:54:21.149321 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be\": container with ID starting with 8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be not found: ID does not exist" containerID="8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.149381 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be"} err="failed to get container status \"8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be\": rpc error: code = NotFound desc = could not find container \"8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be\": container with ID starting with 8a33acab6a9bc864124b578316c429316fcc70550cdc44701e51d6a4bfb6a5be not found: ID does not exist" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.171911 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ktkzv"] Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.174711 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ktkzv"] Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.320815 4626 patch_prober.go:28] interesting pod/console-f9d7485db-ktkzv container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/health\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.320877 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-ktkzv" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:54:21 crc kubenswrapper[4626]: I0223 06:54:21.990361 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" path="/var/lib/kubelet/pods/4abfb5ed-4161-41d1-9cb5-70a93c60e109/volumes" Feb 23 06:54:25 crc kubenswrapper[4626]: I0223 06:54:25.685228 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:54:25 crc kubenswrapper[4626]: I0223 06:54:25.686447 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:54:25 crc kubenswrapper[4626]: I0223 06:54:25.686568 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:54:25 crc kubenswrapper[4626]: I0223 06:54:25.687331 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e82542f36ca6c092b0099e501573dbf83452c1447d038b617beda611bf3799cf"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:54:25 crc kubenswrapper[4626]: I0223 06:54:25.687404 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://e82542f36ca6c092b0099e501573dbf83452c1447d038b617beda611bf3799cf" gracePeriod=600 Feb 23 06:54:26 crc kubenswrapper[4626]: I0223 06:54:26.150395 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="e82542f36ca6c092b0099e501573dbf83452c1447d038b617beda611bf3799cf" exitCode=0 Feb 23 06:54:26 crc kubenswrapper[4626]: I0223 06:54:26.150489 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"e82542f36ca6c092b0099e501573dbf83452c1447d038b617beda611bf3799cf"} Feb 23 06:54:26 crc kubenswrapper[4626]: I0223 06:54:26.150874 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"3524b229b91a471ae481a5ae0ae84698672cefa6dfcb3ba75eb9b84f9fff35b3"} Feb 23 06:54:26 crc kubenswrapper[4626]: I0223 06:54:26.150904 4626 scope.go:117] "RemoveContainer" containerID="5bd8b825f9633bd9403c87dfeb8220ce56a9e848ab820c72cccea63f1dfee03e" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.634514 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq"] Feb 23 06:54:30 crc kubenswrapper[4626]: 
E0223 06:54:30.635226 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerName="pull" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.635240 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerName="pull" Feb 23 06:54:30 crc kubenswrapper[4626]: E0223 06:54:30.635248 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerName="util" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.635253 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerName="util" Feb 23 06:54:30 crc kubenswrapper[4626]: E0223 06:54:30.635272 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerName="extract" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.635277 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerName="extract" Feb 23 06:54:30 crc kubenswrapper[4626]: E0223 06:54:30.635288 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerName="console" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.635294 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerName="console" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.635421 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e5b70d-095c-49f0-93d4-89ba91e35a57" containerName="extract" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.635431 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abfb5ed-4161-41d1-9cb5-70a93c60e109" containerName="console" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.635893 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.638160 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-n7nz5" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.638447 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.638625 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.639211 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.639697 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.654144 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq"] Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.700861 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5605b9df-eb52-4cb0-8f48-a273404aaf5d-webhook-cert\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.700949 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5605b9df-eb52-4cb0-8f48-a273404aaf5d-apiservice-cert\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: 
\"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.701031 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64dds\" (UniqueName: \"kubernetes.io/projected/5605b9df-eb52-4cb0-8f48-a273404aaf5d-kube-api-access-64dds\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.803226 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5605b9df-eb52-4cb0-8f48-a273404aaf5d-webhook-cert\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.803350 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5605b9df-eb52-4cb0-8f48-a273404aaf5d-apiservice-cert\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.803445 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64dds\" (UniqueName: \"kubernetes.io/projected/5605b9df-eb52-4cb0-8f48-a273404aaf5d-kube-api-access-64dds\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.810377 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5605b9df-eb52-4cb0-8f48-a273404aaf5d-apiservice-cert\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.812583 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5605b9df-eb52-4cb0-8f48-a273404aaf5d-webhook-cert\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.821943 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64dds\" (UniqueName: \"kubernetes.io/projected/5605b9df-eb52-4cb0-8f48-a273404aaf5d-kube-api-access-64dds\") pod \"metallb-operator-controller-manager-7df8f6cc8-jqkkq\" (UID: \"5605b9df-eb52-4cb0-8f48-a273404aaf5d\") " pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.871372 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr"] Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.872063 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.873790 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.873796 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.874131 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-lcl2g" Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.890450 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr"] Feb 23 06:54:30 crc kubenswrapper[4626]: I0223 06:54:30.949675 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.005951 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk4zk\" (UniqueName: \"kubernetes.io/projected/e6756a3b-e295-453e-a3a6-1bc81275c97b-kube-api-access-nk4zk\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.006043 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6756a3b-e295-453e-a3a6-1bc81275c97b-webhook-cert\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 
06:54:31.006066 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6756a3b-e295-453e-a3a6-1bc81275c97b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.109416 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk4zk\" (UniqueName: \"kubernetes.io/projected/e6756a3b-e295-453e-a3a6-1bc81275c97b-kube-api-access-nk4zk\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.109744 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6756a3b-e295-453e-a3a6-1bc81275c97b-webhook-cert\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.109772 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6756a3b-e295-453e-a3a6-1bc81275c97b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.117257 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6756a3b-e295-453e-a3a6-1bc81275c97b-apiservice-cert\") pod 
\"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.121732 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6756a3b-e295-453e-a3a6-1bc81275c97b-webhook-cert\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.130036 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk4zk\" (UniqueName: \"kubernetes.io/projected/e6756a3b-e295-453e-a3a6-1bc81275c97b-kube-api-access-nk4zk\") pod \"metallb-operator-webhook-server-6b49bc4cb8-dxzbr\" (UID: \"e6756a3b-e295-453e-a3a6-1bc81275c97b\") " pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.184158 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.434473 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq"] Feb 23 06:54:31 crc kubenswrapper[4626]: I0223 06:54:31.556234 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr"] Feb 23 06:54:31 crc kubenswrapper[4626]: W0223 06:54:31.561032 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6756a3b_e295_453e_a3a6_1bc81275c97b.slice/crio-7c62dc0f782fd0e7d75b1b53d862f8c69531cf86d65ed2808378a37577fae421 WatchSource:0}: Error finding container 7c62dc0f782fd0e7d75b1b53d862f8c69531cf86d65ed2808378a37577fae421: Status 404 returned error can't find the container with id 7c62dc0f782fd0e7d75b1b53d862f8c69531cf86d65ed2808378a37577fae421 Feb 23 06:54:32 crc kubenswrapper[4626]: I0223 06:54:32.194387 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" event={"ID":"5605b9df-eb52-4cb0-8f48-a273404aaf5d","Type":"ContainerStarted","Data":"352f55104b93566b679a83d3c0184cb0815d608bbe0d7d05a7e2a02671f21f28"} Feb 23 06:54:32 crc kubenswrapper[4626]: I0223 06:54:32.195749 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" event={"ID":"e6756a3b-e295-453e-a3a6-1bc81275c97b","Type":"ContainerStarted","Data":"7c62dc0f782fd0e7d75b1b53d862f8c69531cf86d65ed2808378a37577fae421"} Feb 23 06:54:34 crc kubenswrapper[4626]: I0223 06:54:34.092592 4626 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 06:54:36 crc kubenswrapper[4626]: I0223 06:54:36.842663 4626 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-5l5lw"] Feb 23 06:54:36 crc kubenswrapper[4626]: I0223 06:54:36.844530 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:36 crc kubenswrapper[4626]: I0223 06:54:36.866768 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5l5lw"] Feb 23 06:54:36 crc kubenswrapper[4626]: I0223 06:54:36.911655 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7smj\" (UniqueName: \"kubernetes.io/projected/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-kube-api-access-c7smj\") pod \"certified-operators-5l5lw\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:36 crc kubenswrapper[4626]: I0223 06:54:36.911809 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-catalog-content\") pod \"certified-operators-5l5lw\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:36 crc kubenswrapper[4626]: I0223 06:54:36.911884 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-utilities\") pod \"certified-operators-5l5lw\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.013105 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-utilities\") pod \"certified-operators-5l5lw\" (UID: 
\"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.013460 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7smj\" (UniqueName: \"kubernetes.io/projected/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-kube-api-access-c7smj\") pod \"certified-operators-5l5lw\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.013491 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-catalog-content\") pod \"certified-operators-5l5lw\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.013580 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-utilities\") pod \"certified-operators-5l5lw\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.013845 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-catalog-content\") pod \"certified-operators-5l5lw\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.030248 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7smj\" (UniqueName: \"kubernetes.io/projected/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-kube-api-access-c7smj\") pod \"certified-operators-5l5lw\" (UID: 
\"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.159373 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.255432 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" event={"ID":"5605b9df-eb52-4cb0-8f48-a273404aaf5d","Type":"ContainerStarted","Data":"9ddcc954a2e326e174816d19a6de2d169e80e9e6ea658dd75d9776e3ec8cdf0a"} Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.256491 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.269483 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" event={"ID":"e6756a3b-e295-453e-a3a6-1bc81275c97b","Type":"ContainerStarted","Data":"205099901b26b5032ceaf385d3b768c45dfc6b207383388377ed58639d91877c"} Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.269936 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.297629 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" podStartSLOduration=2.296258418 podStartE2EDuration="7.297611379s" podCreationTimestamp="2026-02-23 06:54:30 +0000 UTC" firstStartedPulling="2026-02-23 06:54:31.445513261 +0000 UTC m=+823.784842527" lastFinishedPulling="2026-02-23 06:54:36.446866222 +0000 UTC m=+828.786195488" observedRunningTime="2026-02-23 06:54:37.296437837 +0000 UTC m=+829.635767092" watchObservedRunningTime="2026-02-23 
06:54:37.297611379 +0000 UTC m=+829.636940645" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.332314 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" podStartSLOduration=2.433388217 podStartE2EDuration="7.332285064s" podCreationTimestamp="2026-02-23 06:54:30 +0000 UTC" firstStartedPulling="2026-02-23 06:54:31.563974862 +0000 UTC m=+823.903304128" lastFinishedPulling="2026-02-23 06:54:36.462871709 +0000 UTC m=+828.802200975" observedRunningTime="2026-02-23 06:54:37.327761074 +0000 UTC m=+829.667090340" watchObservedRunningTime="2026-02-23 06:54:37.332285064 +0000 UTC m=+829.671614329" Feb 23 06:54:37 crc kubenswrapper[4626]: I0223 06:54:37.401192 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5l5lw"] Feb 23 06:54:38 crc kubenswrapper[4626]: I0223 06:54:38.276443 4626 generic.go:334] "Generic (PLEG): container finished" podID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerID="435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd" exitCode=0 Feb 23 06:54:38 crc kubenswrapper[4626]: I0223 06:54:38.276552 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5l5lw" event={"ID":"a9b5dede-e7bc-455d-8b99-8e5c0aeac491","Type":"ContainerDied","Data":"435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd"} Feb 23 06:54:38 crc kubenswrapper[4626]: I0223 06:54:38.276892 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5l5lw" event={"ID":"a9b5dede-e7bc-455d-8b99-8e5c0aeac491","Type":"ContainerStarted","Data":"dbb7464ffaf3893fb4040a06be2f23e9759f1f7754ad1ac3cff507851a707a7d"} Feb 23 06:54:39 crc kubenswrapper[4626]: I0223 06:54:39.283668 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5l5lw" 
event={"ID":"a9b5dede-e7bc-455d-8b99-8e5c0aeac491","Type":"ContainerStarted","Data":"9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7"} Feb 23 06:54:40 crc kubenswrapper[4626]: I0223 06:54:40.288882 4626 generic.go:334] "Generic (PLEG): container finished" podID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerID="9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7" exitCode=0 Feb 23 06:54:40 crc kubenswrapper[4626]: I0223 06:54:40.288923 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5l5lw" event={"ID":"a9b5dede-e7bc-455d-8b99-8e5c0aeac491","Type":"ContainerDied","Data":"9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7"} Feb 23 06:54:41 crc kubenswrapper[4626]: I0223 06:54:41.299796 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5l5lw" event={"ID":"a9b5dede-e7bc-455d-8b99-8e5c0aeac491","Type":"ContainerStarted","Data":"97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05"} Feb 23 06:54:41 crc kubenswrapper[4626]: I0223 06:54:41.314297 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5l5lw" podStartSLOduration=2.762411642 podStartE2EDuration="5.314278458s" podCreationTimestamp="2026-02-23 06:54:36 +0000 UTC" firstStartedPulling="2026-02-23 06:54:38.278076774 +0000 UTC m=+830.617406040" lastFinishedPulling="2026-02-23 06:54:40.82994359 +0000 UTC m=+833.169272856" observedRunningTime="2026-02-23 06:54:41.312788949 +0000 UTC m=+833.652118215" watchObservedRunningTime="2026-02-23 06:54:41.314278458 +0000 UTC m=+833.653607713" Feb 23 06:54:47 crc kubenswrapper[4626]: I0223 06:54:47.160034 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:47 crc kubenswrapper[4626]: I0223 06:54:47.160703 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:47 crc kubenswrapper[4626]: I0223 06:54:47.195759 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:47 crc kubenswrapper[4626]: I0223 06:54:47.375151 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:49 crc kubenswrapper[4626]: I0223 06:54:49.426449 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5l5lw"] Feb 23 06:54:49 crc kubenswrapper[4626]: I0223 06:54:49.427558 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5l5lw" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="registry-server" containerID="cri-o://97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05" gracePeriod=2 Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.255433 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.355193 4626 generic.go:334] "Generic (PLEG): container finished" podID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerID="97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05" exitCode=0 Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.355247 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5l5lw" event={"ID":"a9b5dede-e7bc-455d-8b99-8e5c0aeac491","Type":"ContainerDied","Data":"97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05"} Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.355281 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5l5lw" event={"ID":"a9b5dede-e7bc-455d-8b99-8e5c0aeac491","Type":"ContainerDied","Data":"dbb7464ffaf3893fb4040a06be2f23e9759f1f7754ad1ac3cff507851a707a7d"} Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.355302 4626 scope.go:117] "RemoveContainer" containerID="97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.355442 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5l5lw" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.371015 4626 scope.go:117] "RemoveContainer" containerID="9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.401922 4626 scope.go:117] "RemoveContainer" containerID="435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.409978 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-utilities\") pod \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.410207 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7smj\" (UniqueName: \"kubernetes.io/projected/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-kube-api-access-c7smj\") pod \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.410359 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-catalog-content\") pod \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\" (UID: \"a9b5dede-e7bc-455d-8b99-8e5c0aeac491\") " Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.411997 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-utilities" (OuterVolumeSpecName: "utilities") pod "a9b5dede-e7bc-455d-8b99-8e5c0aeac491" (UID: "a9b5dede-e7bc-455d-8b99-8e5c0aeac491"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.418716 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-kube-api-access-c7smj" (OuterVolumeSpecName: "kube-api-access-c7smj") pod "a9b5dede-e7bc-455d-8b99-8e5c0aeac491" (UID: "a9b5dede-e7bc-455d-8b99-8e5c0aeac491"). InnerVolumeSpecName "kube-api-access-c7smj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.423593 4626 scope.go:117] "RemoveContainer" containerID="97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05" Feb 23 06:54:50 crc kubenswrapper[4626]: E0223 06:54:50.425220 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05\": container with ID starting with 97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05 not found: ID does not exist" containerID="97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.425254 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05"} err="failed to get container status \"97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05\": rpc error: code = NotFound desc = could not find container \"97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05\": container with ID starting with 97fdc41200e2e33da1b371fb5f1772d36b345fbd8c6aab9bec51f82f54797b05 not found: ID does not exist" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.425276 4626 scope.go:117] "RemoveContainer" containerID="9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7" Feb 23 06:54:50 crc kubenswrapper[4626]: E0223 06:54:50.425542 
4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7\": container with ID starting with 9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7 not found: ID does not exist" containerID="9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.425562 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7"} err="failed to get container status \"9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7\": rpc error: code = NotFound desc = could not find container \"9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7\": container with ID starting with 9f4a54c73081ffdc2c83b83fa9df37cf11b60cd3840a34a55d3a71066e23f5c7 not found: ID does not exist" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.425576 4626 scope.go:117] "RemoveContainer" containerID="435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd" Feb 23 06:54:50 crc kubenswrapper[4626]: E0223 06:54:50.425831 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd\": container with ID starting with 435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd not found: ID does not exist" containerID="435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.425850 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd"} err="failed to get container status \"435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd\": rpc error: code = 
NotFound desc = could not find container \"435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd\": container with ID starting with 435fc16e3c21308c40cfbc832d72319727ddb0501890ae3764c6ec3a0c341acd not found: ID does not exist" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.454058 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9b5dede-e7bc-455d-8b99-8e5c0aeac491" (UID: "a9b5dede-e7bc-455d-8b99-8e5c0aeac491"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.513129 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7smj\" (UniqueName: \"kubernetes.io/projected/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-kube-api-access-c7smj\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.513226 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.513292 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b5dede-e7bc-455d-8b99-8e5c0aeac491-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.681260 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5l5lw"] Feb 23 06:54:50 crc kubenswrapper[4626]: I0223 06:54:50.687121 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5l5lw"] Feb 23 06:54:51 crc kubenswrapper[4626]: I0223 06:54:51.187872 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-6b49bc4cb8-dxzbr" Feb 23 06:54:51 crc kubenswrapper[4626]: I0223 06:54:51.990462 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" path="/var/lib/kubelet/pods/a9b5dede-e7bc-455d-8b99-8e5c0aeac491/volumes" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.832073 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wm9b6"] Feb 23 06:54:55 crc kubenswrapper[4626]: E0223 06:54:55.832563 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="extract-utilities" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.832578 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="extract-utilities" Feb 23 06:54:55 crc kubenswrapper[4626]: E0223 06:54:55.832587 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="registry-server" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.832593 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="registry-server" Feb 23 06:54:55 crc kubenswrapper[4626]: E0223 06:54:55.832608 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="extract-content" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.832614 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="extract-content" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.832730 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b5dede-e7bc-455d-8b99-8e5c0aeac491" containerName="registry-server" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.833438 4626 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.845395 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wm9b6"] Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.988793 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-utilities\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.988824 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-catalog-content\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:55 crc kubenswrapper[4626]: I0223 06:54:55.988852 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmrm\" (UniqueName: \"kubernetes.io/projected/44774a33-6404-4321-936e-9a8ca31038a7-kube-api-access-5wmrm\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.090297 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-utilities\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.090351 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-catalog-content\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.090384 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wmrm\" (UniqueName: \"kubernetes.io/projected/44774a33-6404-4321-936e-9a8ca31038a7-kube-api-access-5wmrm\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.090794 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-utilities\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.090995 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-catalog-content\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.110254 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmrm\" (UniqueName: \"kubernetes.io/projected/44774a33-6404-4321-936e-9a8ca31038a7-kube-api-access-5wmrm\") pod \"community-operators-wm9b6\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.146711 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:54:56 crc kubenswrapper[4626]: I0223 06:54:56.428069 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wm9b6"] Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.398575 4626 generic.go:334] "Generic (PLEG): container finished" podID="44774a33-6404-4321-936e-9a8ca31038a7" containerID="b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2" exitCode=0 Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.398693 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm9b6" event={"ID":"44774a33-6404-4321-936e-9a8ca31038a7","Type":"ContainerDied","Data":"b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2"} Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.398904 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm9b6" event={"ID":"44774a33-6404-4321-936e-9a8ca31038a7","Type":"ContainerStarted","Data":"6e75c9873ddb28fe3afa4a6c09146aa5967b499df018e75a9cf830eee0fa85dd"} Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.437123 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2v9cq"] Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.438659 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.461568 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v9cq"] Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.509315 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-utilities\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.509390 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-catalog-content\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.509419 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvp8r\" (UniqueName: \"kubernetes.io/projected/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-kube-api-access-vvp8r\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.611146 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-utilities\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.611641 4626 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-catalog-content\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.611668 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvp8r\" (UniqueName: \"kubernetes.io/projected/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-kube-api-access-vvp8r\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.611892 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-utilities\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.612213 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-catalog-content\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.630310 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvp8r\" (UniqueName: \"kubernetes.io/projected/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-kube-api-access-vvp8r\") pod \"redhat-marketplace-2v9cq\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:57 crc kubenswrapper[4626]: I0223 06:54:57.762484 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:54:58 crc kubenswrapper[4626]: I0223 06:54:58.161284 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v9cq"] Feb 23 06:54:58 crc kubenswrapper[4626]: W0223 06:54:58.167709 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f3c3ff_ddb1_48cd_8c19_84d9d4455d78.slice/crio-4f6acd34c72efe701f66b3f303d981576c03021818a8239566a03d4e48fd7aed WatchSource:0}: Error finding container 4f6acd34c72efe701f66b3f303d981576c03021818a8239566a03d4e48fd7aed: Status 404 returned error can't find the container with id 4f6acd34c72efe701f66b3f303d981576c03021818a8239566a03d4e48fd7aed Feb 23 06:54:58 crc kubenswrapper[4626]: I0223 06:54:58.408710 4626 generic.go:334] "Generic (PLEG): container finished" podID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerID="208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d" exitCode=0 Feb 23 06:54:58 crc kubenswrapper[4626]: I0223 06:54:58.408821 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v9cq" event={"ID":"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78","Type":"ContainerDied","Data":"208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d"} Feb 23 06:54:58 crc kubenswrapper[4626]: I0223 06:54:58.408898 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v9cq" event={"ID":"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78","Type":"ContainerStarted","Data":"4f6acd34c72efe701f66b3f303d981576c03021818a8239566a03d4e48fd7aed"} Feb 23 06:54:58 crc kubenswrapper[4626]: I0223 06:54:58.412314 4626 generic.go:334] "Generic (PLEG): container finished" podID="44774a33-6404-4321-936e-9a8ca31038a7" containerID="9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11" exitCode=0 Feb 23 06:54:58 crc kubenswrapper[4626]: I0223 
06:54:58.412388 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm9b6" event={"ID":"44774a33-6404-4321-936e-9a8ca31038a7","Type":"ContainerDied","Data":"9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11"} Feb 23 06:54:59 crc kubenswrapper[4626]: I0223 06:54:59.421882 4626 generic.go:334] "Generic (PLEG): container finished" podID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerID="c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887" exitCode=0 Feb 23 06:54:59 crc kubenswrapper[4626]: I0223 06:54:59.422247 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v9cq" event={"ID":"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78","Type":"ContainerDied","Data":"c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887"} Feb 23 06:54:59 crc kubenswrapper[4626]: I0223 06:54:59.425975 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm9b6" event={"ID":"44774a33-6404-4321-936e-9a8ca31038a7","Type":"ContainerStarted","Data":"f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa"} Feb 23 06:54:59 crc kubenswrapper[4626]: I0223 06:54:59.462124 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wm9b6" podStartSLOduration=2.979482765 podStartE2EDuration="4.462098256s" podCreationTimestamp="2026-02-23 06:54:55 +0000 UTC" firstStartedPulling="2026-02-23 06:54:57.400455018 +0000 UTC m=+849.739784284" lastFinishedPulling="2026-02-23 06:54:58.883070509 +0000 UTC m=+851.222399775" observedRunningTime="2026-02-23 06:54:59.461691679 +0000 UTC m=+851.801020935" watchObservedRunningTime="2026-02-23 06:54:59.462098256 +0000 UTC m=+851.801427522" Feb 23 06:55:00 crc kubenswrapper[4626]: I0223 06:55:00.433024 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v9cq" 
event={"ID":"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78","Type":"ContainerStarted","Data":"449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824"} Feb 23 06:55:00 crc kubenswrapper[4626]: I0223 06:55:00.456173 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2v9cq" podStartSLOduration=2.005299558 podStartE2EDuration="3.456158046s" podCreationTimestamp="2026-02-23 06:54:57 +0000 UTC" firstStartedPulling="2026-02-23 06:54:58.410726539 +0000 UTC m=+850.750055805" lastFinishedPulling="2026-02-23 06:54:59.861585026 +0000 UTC m=+852.200914293" observedRunningTime="2026-02-23 06:55:00.451706396 +0000 UTC m=+852.791035661" watchObservedRunningTime="2026-02-23 06:55:00.456158046 +0000 UTC m=+852.795487312" Feb 23 06:55:06 crc kubenswrapper[4626]: I0223 06:55:06.147819 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:55:06 crc kubenswrapper[4626]: I0223 06:55:06.148484 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:55:06 crc kubenswrapper[4626]: I0223 06:55:06.184754 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:55:06 crc kubenswrapper[4626]: I0223 06:55:06.504771 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:55:07 crc kubenswrapper[4626]: I0223 06:55:07.762850 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:55:07 crc kubenswrapper[4626]: I0223 06:55:07.762920 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:55:07 crc kubenswrapper[4626]: I0223 06:55:07.797279 4626 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:55:08 crc kubenswrapper[4626]: I0223 06:55:08.509045 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.032488 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wm9b6"] Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.032754 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wm9b6" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="registry-server" containerID="cri-o://f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa" gracePeriod=2 Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.365446 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.475049 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wmrm\" (UniqueName: \"kubernetes.io/projected/44774a33-6404-4321-936e-9a8ca31038a7-kube-api-access-5wmrm\") pod \"44774a33-6404-4321-936e-9a8ca31038a7\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.475166 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-catalog-content\") pod \"44774a33-6404-4321-936e-9a8ca31038a7\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.475273 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-utilities\") pod \"44774a33-6404-4321-936e-9a8ca31038a7\" (UID: \"44774a33-6404-4321-936e-9a8ca31038a7\") " Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.476086 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-utilities" (OuterVolumeSpecName: "utilities") pod "44774a33-6404-4321-936e-9a8ca31038a7" (UID: "44774a33-6404-4321-936e-9a8ca31038a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.480776 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44774a33-6404-4321-936e-9a8ca31038a7-kube-api-access-5wmrm" (OuterVolumeSpecName: "kube-api-access-5wmrm") pod "44774a33-6404-4321-936e-9a8ca31038a7" (UID: "44774a33-6404-4321-936e-9a8ca31038a7"). InnerVolumeSpecName "kube-api-access-5wmrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.488439 4626 generic.go:334] "Generic (PLEG): container finished" podID="44774a33-6404-4321-936e-9a8ca31038a7" containerID="f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa" exitCode=0 Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.488613 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm9b6" event={"ID":"44774a33-6404-4321-936e-9a8ca31038a7","Type":"ContainerDied","Data":"f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa"} Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.488667 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wm9b6" event={"ID":"44774a33-6404-4321-936e-9a8ca31038a7","Type":"ContainerDied","Data":"6e75c9873ddb28fe3afa4a6c09146aa5967b499df018e75a9cf830eee0fa85dd"} Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.488689 4626 scope.go:117] "RemoveContainer" containerID="f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.488691 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wm9b6" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.503622 4626 scope.go:117] "RemoveContainer" containerID="9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.519709 4626 scope.go:117] "RemoveContainer" containerID="b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.519866 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44774a33-6404-4321-936e-9a8ca31038a7" (UID: "44774a33-6404-4321-936e-9a8ca31038a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.533416 4626 scope.go:117] "RemoveContainer" containerID="f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa" Feb 23 06:55:09 crc kubenswrapper[4626]: E0223 06:55:09.533814 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa\": container with ID starting with f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa not found: ID does not exist" containerID="f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.533854 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa"} err="failed to get container status \"f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa\": rpc error: code = NotFound desc = could not find container \"f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa\": 
container with ID starting with f599232a075584123da442666b641c86a3d84dc40227f6f5048af4f736f936aa not found: ID does not exist" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.533902 4626 scope.go:117] "RemoveContainer" containerID="9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11" Feb 23 06:55:09 crc kubenswrapper[4626]: E0223 06:55:09.534196 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11\": container with ID starting with 9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11 not found: ID does not exist" containerID="9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.534226 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11"} err="failed to get container status \"9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11\": rpc error: code = NotFound desc = could not find container \"9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11\": container with ID starting with 9166ca2aeeb731c82b0f3cc255d3dbe08fab5c87e370201bcff04eb32c104d11 not found: ID does not exist" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.534248 4626 scope.go:117] "RemoveContainer" containerID="b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2" Feb 23 06:55:09 crc kubenswrapper[4626]: E0223 06:55:09.534547 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2\": container with ID starting with b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2 not found: ID does not exist" 
containerID="b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.534593 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2"} err="failed to get container status \"b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2\": rpc error: code = NotFound desc = could not find container \"b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2\": container with ID starting with b5c2241130b1e958d94b11c2e256edd39e5032ca11da141855ed88ca97650fc2 not found: ID does not exist" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.577374 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.577575 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wmrm\" (UniqueName: \"kubernetes.io/projected/44774a33-6404-4321-936e-9a8ca31038a7-kube-api-access-5wmrm\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.577644 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44774a33-6404-4321-936e-9a8ca31038a7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.817404 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wm9b6"] Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.822835 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wm9b6"] Feb 23 06:55:09 crc kubenswrapper[4626]: I0223 06:55:09.988986 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="44774a33-6404-4321-936e-9a8ca31038a7" path="/var/lib/kubelet/pods/44774a33-6404-4321-936e-9a8ca31038a7/volumes" Feb 23 06:55:10 crc kubenswrapper[4626]: I0223 06:55:10.953086 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7df8f6cc8-jqkkq" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.493817 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jxvxj"] Feb 23 06:55:11 crc kubenswrapper[4626]: E0223 06:55:11.494327 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="extract-content" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.494346 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="extract-content" Feb 23 06:55:11 crc kubenswrapper[4626]: E0223 06:55:11.494368 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="registry-server" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.494375 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="registry-server" Feb 23 06:55:11 crc kubenswrapper[4626]: E0223 06:55:11.494390 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="extract-utilities" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.494396 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="extract-utilities" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.494532 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="44774a33-6404-4321-936e-9a8ca31038a7" containerName="registry-server" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.496370 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.505930 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.506140 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.506831 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-x4zm8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.525827 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h"] Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.527336 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.532616 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.542030 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h"] Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.607646 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-sockets\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.607966 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-metrics\") pod \"frr-k8s-jxvxj\" (UID: 
\"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.608101 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-conf\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.608206 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-metrics-certs\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.608307 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-reloader\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.608405 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-startup\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.608538 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqp6\" (UniqueName: \"kubernetes.io/projected/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-kube-api-access-6jqp6\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc 
kubenswrapper[4626]: I0223 06:55:11.658814 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cnjgx"] Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.659634 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.662103 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.662164 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.663112 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.663346 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-sthxk" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.687653 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-94xt8"] Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.688473 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.690120 4626 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.709903 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqp6\" (UniqueName: \"kubernetes.io/projected/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-kube-api-access-6jqp6\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.709957 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-sockets\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710004 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-metrics\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710042 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlb55\" (UniqueName: \"kubernetes.io/projected/06339724-62db-4cc8-930b-5ce2572b46da-kube-api-access-wlb55\") pod \"frr-k8s-webhook-server-78b44bf5bb-tvj6h\" (UID: \"06339724-62db-4cc8-930b-5ce2572b46da\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710079 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-conf\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710108 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-metrics-certs\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710116 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-94xt8"] Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710144 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-reloader\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710168 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-startup\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710183 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06339724-62db-4cc8-930b-5ce2572b46da-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-tvj6h\" (UID: \"06339724-62db-4cc8-930b-5ce2572b46da\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710530 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-sockets\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.710583 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-metrics\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.711221 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-startup\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.714115 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-frr-conf\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.714277 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-reloader\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.726721 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-metrics-certs\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.729911 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqp6\" (UniqueName: \"kubernetes.io/projected/d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd-kube-api-access-6jqp6\") pod \"frr-k8s-jxvxj\" (UID: \"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd\") " pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.811315 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-metrics-certs\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.811387 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52957c0d-2568-459b-a83c-635bbd08c164-cert\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.811419 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlb55\" (UniqueName: \"kubernetes.io/projected/06339724-62db-4cc8-930b-5ce2572b46da-kube-api-access-wlb55\") pod \"frr-k8s-webhook-server-78b44bf5bb-tvj6h\" (UID: \"06339724-62db-4cc8-930b-5ce2572b46da\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.811468 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52957c0d-2568-459b-a83c-635bbd08c164-metrics-certs\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.811694 4626 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.811891 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-memberlist\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.812023 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a86df0ed-d52d-4095-aed6-b298542a1c2e-metallb-excludel2\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.812107 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xslvz\" (UniqueName: \"kubernetes.io/projected/a86df0ed-d52d-4095-aed6-b298542a1c2e-kube-api-access-xslvz\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.812190 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06339724-62db-4cc8-930b-5ce2572b46da-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-tvj6h\" (UID: \"06339724-62db-4cc8-930b-5ce2572b46da\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.812379 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj8rr\" (UniqueName: \"kubernetes.io/projected/52957c0d-2568-459b-a83c-635bbd08c164-kube-api-access-dj8rr\") pod \"controller-69bbfbf88f-94xt8\" (UID: 
\"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: E0223 06:55:11.812303 4626 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 23 06:55:11 crc kubenswrapper[4626]: E0223 06:55:11.812624 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06339724-62db-4cc8-930b-5ce2572b46da-cert podName:06339724-62db-4cc8-930b-5ce2572b46da nodeName:}" failed. No retries permitted until 2026-02-23 06:55:12.31260231 +0000 UTC m=+864.651931576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06339724-62db-4cc8-930b-5ce2572b46da-cert") pod "frr-k8s-webhook-server-78b44bf5bb-tvj6h" (UID: "06339724-62db-4cc8-930b-5ce2572b46da") : secret "frr-k8s-webhook-server-cert" not found Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.845530 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlb55\" (UniqueName: \"kubernetes.io/projected/06339724-62db-4cc8-930b-5ce2572b46da-kube-api-access-wlb55\") pod \"frr-k8s-webhook-server-78b44bf5bb-tvj6h\" (UID: \"06339724-62db-4cc8-930b-5ce2572b46da\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.914352 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-memberlist\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.914433 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a86df0ed-d52d-4095-aed6-b298542a1c2e-metallb-excludel2\") pod \"speaker-cnjgx\" (UID: 
\"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.914461 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xslvz\" (UniqueName: \"kubernetes.io/projected/a86df0ed-d52d-4095-aed6-b298542a1c2e-kube-api-access-xslvz\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.914576 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj8rr\" (UniqueName: \"kubernetes.io/projected/52957c0d-2568-459b-a83c-635bbd08c164-kube-api-access-dj8rr\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.914609 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-metrics-certs\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.914642 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52957c0d-2568-459b-a83c-635bbd08c164-cert\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.914708 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52957c0d-2568-459b-a83c-635bbd08c164-metrics-certs\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc 
kubenswrapper[4626]: E0223 06:55:11.915204 4626 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 23 06:55:11 crc kubenswrapper[4626]: E0223 06:55:11.915348 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-memberlist podName:a86df0ed-d52d-4095-aed6-b298542a1c2e nodeName:}" failed. No retries permitted until 2026-02-23 06:55:12.415312035 +0000 UTC m=+864.754641301 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-memberlist") pod "speaker-cnjgx" (UID: "a86df0ed-d52d-4095-aed6-b298542a1c2e") : secret "metallb-memberlist" not found Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.915487 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a86df0ed-d52d-4095-aed6-b298542a1c2e-metallb-excludel2\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.918220 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-metrics-certs\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.918516 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52957c0d-2568-459b-a83c-635bbd08c164-metrics-certs\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.920065 4626 reflector.go:368] Caches populated for *v1.Secret from 
object-"metallb-system"/"metallb-webhook-cert" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.927964 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52957c0d-2568-459b-a83c-635bbd08c164-cert\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.931059 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xslvz\" (UniqueName: \"kubernetes.io/projected/a86df0ed-d52d-4095-aed6-b298542a1c2e-kube-api-access-xslvz\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.938958 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj8rr\" (UniqueName: \"kubernetes.io/projected/52957c0d-2568-459b-a83c-635bbd08c164-kube-api-access-dj8rr\") pod \"controller-69bbfbf88f-94xt8\" (UID: \"52957c0d-2568-459b-a83c-635bbd08c164\") " pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:11 crc kubenswrapper[4626]: I0223 06:55:11.999420 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.172659 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-94xt8"] Feb 23 06:55:12 crc kubenswrapper[4626]: W0223 06:55:12.179739 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52957c0d_2568_459b_a83c_635bbd08c164.slice/crio-5019d4bfce33d1ebe9ecde5057a33fe19b80a1ee5f6ee831bd88319af9950093 WatchSource:0}: Error finding container 5019d4bfce33d1ebe9ecde5057a33fe19b80a1ee5f6ee831bd88319af9950093: Status 404 returned error can't find the container with id 5019d4bfce33d1ebe9ecde5057a33fe19b80a1ee5f6ee831bd88319af9950093 Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.319987 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06339724-62db-4cc8-930b-5ce2572b46da-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-tvj6h\" (UID: \"06339724-62db-4cc8-930b-5ce2572b46da\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.325345 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06339724-62db-4cc8-930b-5ce2572b46da-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-tvj6h\" (UID: \"06339724-62db-4cc8-930b-5ce2572b46da\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.421599 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-memberlist\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.425556 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a86df0ed-d52d-4095-aed6-b298542a1c2e-memberlist\") pod \"speaker-cnjgx\" (UID: \"a86df0ed-d52d-4095-aed6-b298542a1c2e\") " pod="metallb-system/speaker-cnjgx" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.441917 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.511473 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerStarted","Data":"ece537aed4d8ce5553fb59167654f547cf497844e6df2418fe919c8775165758"} Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.519224 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-94xt8" event={"ID":"52957c0d-2568-459b-a83c-635bbd08c164","Type":"ContainerStarted","Data":"046d7b5028223da7a3d42a6229781de612165b5ec8ae5870e9c4d9c0e3473eb8"} Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.519246 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-94xt8" event={"ID":"52957c0d-2568-459b-a83c-635bbd08c164","Type":"ContainerStarted","Data":"3cddea7a361dc288cf87e30d4c463af25ce268c2905afa18bd71e98a6f82f363"} Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.519257 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-94xt8" event={"ID":"52957c0d-2568-459b-a83c-635bbd08c164","Type":"ContainerStarted","Data":"5019d4bfce33d1ebe9ecde5057a33fe19b80a1ee5f6ee831bd88319af9950093"} Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.520591 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.546741 4626 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-94xt8" podStartSLOduration=1.546722376 podStartE2EDuration="1.546722376s" podCreationTimestamp="2026-02-23 06:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:55:12.546139767 +0000 UTC m=+864.885469034" watchObservedRunningTime="2026-02-23 06:55:12.546722376 +0000 UTC m=+864.886051643" Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.572657 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cnjgx" Feb 23 06:55:12 crc kubenswrapper[4626]: W0223 06:55:12.592005 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86df0ed_d52d_4095_aed6_b298542a1c2e.slice/crio-3a71b3c13992187d24129f5250804073ef55ab615cfedc53b81c9a68101e1899 WatchSource:0}: Error finding container 3a71b3c13992187d24129f5250804073ef55ab615cfedc53b81c9a68101e1899: Status 404 returned error can't find the container with id 3a71b3c13992187d24129f5250804073ef55ab615cfedc53b81c9a68101e1899 Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.628088 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v9cq"] Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.628468 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2v9cq" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="registry-server" containerID="cri-o://449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824" gracePeriod=2 Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.838394 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h"] Feb 23 06:55:12 crc kubenswrapper[4626]: W0223 
06:55:12.849678 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06339724_62db_4cc8_930b_5ce2572b46da.slice/crio-9d47a02231c7978f566b84a2f886b633b2872b3455711c399be02d812862a48d WatchSource:0}: Error finding container 9d47a02231c7978f566b84a2f886b633b2872b3455711c399be02d812862a48d: Status 404 returned error can't find the container with id 9d47a02231c7978f566b84a2f886b633b2872b3455711c399be02d812862a48d Feb 23 06:55:12 crc kubenswrapper[4626]: I0223 06:55:12.987327 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.137583 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvp8r\" (UniqueName: \"kubernetes.io/projected/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-kube-api-access-vvp8r\") pod \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.137865 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-utilities\") pod \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.137940 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-catalog-content\") pod \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\" (UID: \"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78\") " Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.138780 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-utilities" (OuterVolumeSpecName: 
"utilities") pod "98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" (UID: "98f3c3ff-ddb1-48cd-8c19-84d9d4455d78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.143295 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-kube-api-access-vvp8r" (OuterVolumeSpecName: "kube-api-access-vvp8r") pod "98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" (UID: "98f3c3ff-ddb1-48cd-8c19-84d9d4455d78"). InnerVolumeSpecName "kube-api-access-vvp8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.162445 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" (UID: "98f3c3ff-ddb1-48cd-8c19-84d9d4455d78"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.239615 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.239639 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.239649 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvp8r\" (UniqueName: \"kubernetes.io/projected/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78-kube-api-access-vvp8r\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.529093 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cnjgx" event={"ID":"a86df0ed-d52d-4095-aed6-b298542a1c2e","Type":"ContainerStarted","Data":"2ddfec61289b9c8c8c452f039f4a9c9cca29518d4ebebdcf50163b12daedacd8"} Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.529557 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cnjgx" event={"ID":"a86df0ed-d52d-4095-aed6-b298542a1c2e","Type":"ContainerStarted","Data":"964adb45f37b9c857e6ef9c4b6ce343393fe91250cc92c2f25cc6dbf1aa86b19"} Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.529573 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cnjgx" event={"ID":"a86df0ed-d52d-4095-aed6-b298542a1c2e","Type":"ContainerStarted","Data":"3a71b3c13992187d24129f5250804073ef55ab615cfedc53b81c9a68101e1899"} Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.529778 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cnjgx" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 
06:55:13.532021 4626 generic.go:334] "Generic (PLEG): container finished" podID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerID="449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824" exitCode=0 Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.532061 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v9cq" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.532148 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v9cq" event={"ID":"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78","Type":"ContainerDied","Data":"449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824"} Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.532221 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v9cq" event={"ID":"98f3c3ff-ddb1-48cd-8c19-84d9d4455d78","Type":"ContainerDied","Data":"4f6acd34c72efe701f66b3f303d981576c03021818a8239566a03d4e48fd7aed"} Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.532248 4626 scope.go:117] "RemoveContainer" containerID="449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.533665 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" event={"ID":"06339724-62db-4cc8-930b-5ce2572b46da","Type":"ContainerStarted","Data":"9d47a02231c7978f566b84a2f886b633b2872b3455711c399be02d812862a48d"} Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.551902 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cnjgx" podStartSLOduration=2.5518782829999997 podStartE2EDuration="2.551878283s" podCreationTimestamp="2026-02-23 06:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 
06:55:13.546466182 +0000 UTC m=+865.885795448" watchObservedRunningTime="2026-02-23 06:55:13.551878283 +0000 UTC m=+865.891207539" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.575720 4626 scope.go:117] "RemoveContainer" containerID="c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.583368 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v9cq"] Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.590707 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v9cq"] Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.622187 4626 scope.go:117] "RemoveContainer" containerID="208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.635597 4626 scope.go:117] "RemoveContainer" containerID="449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824" Feb 23 06:55:13 crc kubenswrapper[4626]: E0223 06:55:13.635972 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824\": container with ID starting with 449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824 not found: ID does not exist" containerID="449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.636038 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824"} err="failed to get container status \"449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824\": rpc error: code = NotFound desc = could not find container \"449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824\": container with ID starting with 
449db01e4f43ba9413cd3110a8c5658b65040569b1f612b7c100f022ec123824 not found: ID does not exist" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.636063 4626 scope.go:117] "RemoveContainer" containerID="c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887" Feb 23 06:55:13 crc kubenswrapper[4626]: E0223 06:55:13.636437 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887\": container with ID starting with c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887 not found: ID does not exist" containerID="c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.636486 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887"} err="failed to get container status \"c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887\": rpc error: code = NotFound desc = could not find container \"c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887\": container with ID starting with c722066aeeea86b22843278cdee3248d39a6a410d889e054c9dcf4b8c6752887 not found: ID does not exist" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.636528 4626 scope.go:117] "RemoveContainer" containerID="208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d" Feb 23 06:55:13 crc kubenswrapper[4626]: E0223 06:55:13.636791 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d\": container with ID starting with 208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d not found: ID does not exist" containerID="208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d" Feb 23 06:55:13 crc 
kubenswrapper[4626]: I0223 06:55:13.636865 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d"} err="failed to get container status \"208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d\": rpc error: code = NotFound desc = could not find container \"208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d\": container with ID starting with 208e0742e523ed32b477da05967c2501fe0d1e421452ef186ac5c0dc1814053d not found: ID does not exist" Feb 23 06:55:13 crc kubenswrapper[4626]: I0223 06:55:13.997016 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" path="/var/lib/kubelet/pods/98f3c3ff-ddb1-48cd-8c19-84d9d4455d78/volumes" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.586576 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" event={"ID":"06339724-62db-4cc8-930b-5ce2572b46da","Type":"ContainerStarted","Data":"aa615f09670f1f9a74c8d3c46d3c5554290c6d6dad7037f678fc89372478c616"} Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.587896 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.588404 4626 generic.go:334] "Generic (PLEG): container finished" podID="d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd" containerID="5396fc2548aefc1a46fd7e6990c62670a166a7dbf0de2e38d16ab19b819547fb" exitCode=0 Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.588459 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerDied","Data":"5396fc2548aefc1a46fd7e6990c62670a166a7dbf0de2e38d16ab19b819547fb"} Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.603148 4626 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" podStartSLOduration=2.20854089 podStartE2EDuration="8.603129054s" podCreationTimestamp="2026-02-23 06:55:11 +0000 UTC" firstStartedPulling="2026-02-23 06:55:12.854478788 +0000 UTC m=+865.193808054" lastFinishedPulling="2026-02-23 06:55:19.249066952 +0000 UTC m=+871.588396218" observedRunningTime="2026-02-23 06:55:19.602168543 +0000 UTC m=+871.941497810" watchObservedRunningTime="2026-02-23 06:55:19.603129054 +0000 UTC m=+871.942458320" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.645519 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2c8dv"] Feb 23 06:55:19 crc kubenswrapper[4626]: E0223 06:55:19.645803 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="extract-utilities" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.645828 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="extract-utilities" Feb 23 06:55:19 crc kubenswrapper[4626]: E0223 06:55:19.645842 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="registry-server" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.645850 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="registry-server" Feb 23 06:55:19 crc kubenswrapper[4626]: E0223 06:55:19.645866 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="extract-content" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.645873 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="extract-content" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.646004 4626 
memory_manager.go:354] "RemoveStaleState removing state" podUID="98f3c3ff-ddb1-48cd-8c19-84d9d4455d78" containerName="registry-server" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.646975 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.656657 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2c8dv"] Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.754778 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-catalog-content\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.755287 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdm4d\" (UniqueName: \"kubernetes.io/projected/812f7ca1-9cd5-42dd-b69f-28090507fce0-kube-api-access-kdm4d\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.755366 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-utilities\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.856827 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdm4d\" (UniqueName: \"kubernetes.io/projected/812f7ca1-9cd5-42dd-b69f-28090507fce0-kube-api-access-kdm4d\") pod 
\"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.856899 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-utilities\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.857023 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-catalog-content\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.857488 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-utilities\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.857576 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-catalog-content\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.883597 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdm4d\" (UniqueName: \"kubernetes.io/projected/812f7ca1-9cd5-42dd-b69f-28090507fce0-kube-api-access-kdm4d\") pod \"redhat-operators-2c8dv\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " 
pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:19 crc kubenswrapper[4626]: I0223 06:55:19.986564 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:20 crc kubenswrapper[4626]: I0223 06:55:20.226403 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2c8dv"] Feb 23 06:55:20 crc kubenswrapper[4626]: I0223 06:55:20.598142 4626 generic.go:334] "Generic (PLEG): container finished" podID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerID="42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3" exitCode=0 Feb 23 06:55:20 crc kubenswrapper[4626]: I0223 06:55:20.598258 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c8dv" event={"ID":"812f7ca1-9cd5-42dd-b69f-28090507fce0","Type":"ContainerDied","Data":"42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3"} Feb 23 06:55:20 crc kubenswrapper[4626]: I0223 06:55:20.598631 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c8dv" event={"ID":"812f7ca1-9cd5-42dd-b69f-28090507fce0","Type":"ContainerStarted","Data":"012c19314e61409462767bfd316be9a0faca18994e46cdfd44090db63fa06eee"} Feb 23 06:55:20 crc kubenswrapper[4626]: I0223 06:55:20.601629 4626 generic.go:334] "Generic (PLEG): container finished" podID="d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd" containerID="5232e8771cd1546320b3fdb08b88741ed9511f7b66816229fd9951fa0ea54982" exitCode=0 Feb 23 06:55:20 crc kubenswrapper[4626]: I0223 06:55:20.601705 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerDied","Data":"5232e8771cd1546320b3fdb08b88741ed9511f7b66816229fd9951fa0ea54982"} Feb 23 06:55:21 crc kubenswrapper[4626]: I0223 06:55:21.609997 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2c8dv" event={"ID":"812f7ca1-9cd5-42dd-b69f-28090507fce0","Type":"ContainerStarted","Data":"436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60"} Feb 23 06:55:21 crc kubenswrapper[4626]: I0223 06:55:21.613933 4626 generic.go:334] "Generic (PLEG): container finished" podID="d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd" containerID="deea4442aa21eb78f7f9650043878eb4ed5fb760fd8c6bb4e4c8490e0ae5fd09" exitCode=0 Feb 23 06:55:21 crc kubenswrapper[4626]: I0223 06:55:21.613972 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerDied","Data":"deea4442aa21eb78f7f9650043878eb4ed5fb760fd8c6bb4e4c8490e0ae5fd09"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.576478 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cnjgx" Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.625912 4626 generic.go:334] "Generic (PLEG): container finished" podID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerID="436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60" exitCode=0 Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.626035 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c8dv" event={"ID":"812f7ca1-9cd5-42dd-b69f-28090507fce0","Type":"ContainerDied","Data":"436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.632727 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerStarted","Data":"478245c213ba5a818bcfea9bb9457e44db49744637a7eb2166941c28e738f9ef"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.632786 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" 
event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerStarted","Data":"6fca980ef4a1d24616ee9c34b6b80d2fb4c5f84ea4ec68d97c1b4f0310f05664"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.632802 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerStarted","Data":"00ad246acf8a2555513cf2a93c51daddb41d9405b1c889d92fe5357afe7dc2bc"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.632814 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerStarted","Data":"3fcd05328e8c50644d09024ad73a50cf55f1de31a7f735ff43983167c334ac69"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.632824 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerStarted","Data":"3495fc19a05f136705952282afcc2fb604fb85881906065a0a9f0098967c7aad"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.632834 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jxvxj" event={"ID":"d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd","Type":"ContainerStarted","Data":"3e17d615d9ef01586969ef4dc996081f6ae368aa599d314ce1ec836de5c4b5ba"} Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.633552 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:22 crc kubenswrapper[4626]: I0223 06:55:22.666688 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jxvxj" podStartSLOduration=4.353266043 podStartE2EDuration="11.666662623s" podCreationTimestamp="2026-02-23 06:55:11 +0000 UTC" firstStartedPulling="2026-02-23 06:55:11.934328252 +0000 UTC m=+864.273657518" lastFinishedPulling="2026-02-23 06:55:19.247724842 +0000 UTC m=+871.587054098" 
observedRunningTime="2026-02-23 06:55:22.661109925 +0000 UTC m=+875.000439191" watchObservedRunningTime="2026-02-23 06:55:22.666662623 +0000 UTC m=+875.005991879" Feb 23 06:55:23 crc kubenswrapper[4626]: I0223 06:55:23.643887 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c8dv" event={"ID":"812f7ca1-9cd5-42dd-b69f-28090507fce0","Type":"ContainerStarted","Data":"33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b"} Feb 23 06:55:23 crc kubenswrapper[4626]: I0223 06:55:23.668336 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2c8dv" podStartSLOduration=2.142104857 podStartE2EDuration="4.668313192s" podCreationTimestamp="2026-02-23 06:55:19 +0000 UTC" firstStartedPulling="2026-02-23 06:55:20.599934269 +0000 UTC m=+872.939263525" lastFinishedPulling="2026-02-23 06:55:23.126142604 +0000 UTC m=+875.465471860" observedRunningTime="2026-02-23 06:55:23.665293641 +0000 UTC m=+876.004622907" watchObservedRunningTime="2026-02-23 06:55:23.668313192 +0000 UTC m=+876.007642458" Feb 23 06:55:26 crc kubenswrapper[4626]: I0223 06:55:26.812882 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:26 crc kubenswrapper[4626]: I0223 06:55:26.847654 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.049369 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dqtlp"] Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.050055 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.052907 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.053683 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cpxrf" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.054348 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.092799 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dqtlp"] Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.095359 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhgb\" (UniqueName: \"kubernetes.io/projected/0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7-kube-api-access-dkhgb\") pod \"openstack-operator-index-dqtlp\" (UID: \"0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7\") " pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.196412 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhgb\" (UniqueName: \"kubernetes.io/projected/0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7-kube-api-access-dkhgb\") pod \"openstack-operator-index-dqtlp\" (UID: \"0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7\") " pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.216254 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhgb\" (UniqueName: \"kubernetes.io/projected/0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7-kube-api-access-dkhgb\") pod \"openstack-operator-index-dqtlp\" (UID: 
\"0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7\") " pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.381585 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.567434 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dqtlp"] Feb 23 06:55:28 crc kubenswrapper[4626]: I0223 06:55:28.680188 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dqtlp" event={"ID":"0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7","Type":"ContainerStarted","Data":"3da442776b6a7485e25781aa8ca1afe52845fe409a65507636be1632d6e9de0d"} Feb 23 06:55:29 crc kubenswrapper[4626]: I0223 06:55:29.691037 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dqtlp" event={"ID":"0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7","Type":"ContainerStarted","Data":"78b2aa54eaa287c8bc0cb6892b3858154b138b80672e666ac1f421b8b195e697"} Feb 23 06:55:29 crc kubenswrapper[4626]: I0223 06:55:29.706869 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dqtlp" podStartSLOduration=0.784477201 podStartE2EDuration="1.706845047s" podCreationTimestamp="2026-02-23 06:55:28 +0000 UTC" firstStartedPulling="2026-02-23 06:55:28.574460083 +0000 UTC m=+880.913789349" lastFinishedPulling="2026-02-23 06:55:29.496827938 +0000 UTC m=+881.836157195" observedRunningTime="2026-02-23 06:55:29.704574326 +0000 UTC m=+882.043903581" watchObservedRunningTime="2026-02-23 06:55:29.706845047 +0000 UTC m=+882.046174312" Feb 23 06:55:29 crc kubenswrapper[4626]: I0223 06:55:29.990819 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:29 crc kubenswrapper[4626]: I0223 
06:55:29.990873 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:30 crc kubenswrapper[4626]: I0223 06:55:30.026421 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:30 crc kubenswrapper[4626]: I0223 06:55:30.731979 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:31 crc kubenswrapper[4626]: I0223 06:55:31.819949 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jxvxj" Feb 23 06:55:32 crc kubenswrapper[4626]: I0223 06:55:32.004599 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-94xt8" Feb 23 06:55:32 crc kubenswrapper[4626]: I0223 06:55:32.446225 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-tvj6h" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.228303 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2c8dv"] Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.228888 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2c8dv" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="registry-server" containerID="cri-o://33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b" gracePeriod=2 Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.601867 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.723039 4626 generic.go:334] "Generic (PLEG): container finished" podID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerID="33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b" exitCode=0 Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.723084 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c8dv" event={"ID":"812f7ca1-9cd5-42dd-b69f-28090507fce0","Type":"ContainerDied","Data":"33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b"} Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.723114 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2c8dv" event={"ID":"812f7ca1-9cd5-42dd-b69f-28090507fce0","Type":"ContainerDied","Data":"012c19314e61409462767bfd316be9a0faca18994e46cdfd44090db63fa06eee"} Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.723119 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2c8dv" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.723136 4626 scope.go:117] "RemoveContainer" containerID="33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.750759 4626 scope.go:117] "RemoveContainer" containerID="436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.768249 4626 scope.go:117] "RemoveContainer" containerID="42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.778320 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-utilities\") pod \"812f7ca1-9cd5-42dd-b69f-28090507fce0\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.778366 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-catalog-content\") pod \"812f7ca1-9cd5-42dd-b69f-28090507fce0\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.778402 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdm4d\" (UniqueName: \"kubernetes.io/projected/812f7ca1-9cd5-42dd-b69f-28090507fce0-kube-api-access-kdm4d\") pod \"812f7ca1-9cd5-42dd-b69f-28090507fce0\" (UID: \"812f7ca1-9cd5-42dd-b69f-28090507fce0\") " Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.779790 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-utilities" (OuterVolumeSpecName: "utilities") pod "812f7ca1-9cd5-42dd-b69f-28090507fce0" (UID: 
"812f7ca1-9cd5-42dd-b69f-28090507fce0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.781005 4626 scope.go:117] "RemoveContainer" containerID="33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.784629 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812f7ca1-9cd5-42dd-b69f-28090507fce0-kube-api-access-kdm4d" (OuterVolumeSpecName: "kube-api-access-kdm4d") pod "812f7ca1-9cd5-42dd-b69f-28090507fce0" (UID: "812f7ca1-9cd5-42dd-b69f-28090507fce0"). InnerVolumeSpecName "kube-api-access-kdm4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:55:34 crc kubenswrapper[4626]: E0223 06:55:34.786834 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b\": container with ID starting with 33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b not found: ID does not exist" containerID="33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.786948 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b"} err="failed to get container status \"33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b\": rpc error: code = NotFound desc = could not find container \"33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b\": container with ID starting with 33c86482302ed784e710fe9bbca91a9cfd984fdb1f98c38e8b698c3245fe8d8b not found: ID does not exist" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.786987 4626 scope.go:117] "RemoveContainer" 
containerID="436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60" Feb 23 06:55:34 crc kubenswrapper[4626]: E0223 06:55:34.787315 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60\": container with ID starting with 436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60 not found: ID does not exist" containerID="436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.787342 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60"} err="failed to get container status \"436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60\": rpc error: code = NotFound desc = could not find container \"436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60\": container with ID starting with 436358d9b0d2360780ef474b16382467c8b28ef69f141110dd57e1df58a6ed60 not found: ID does not exist" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.787356 4626 scope.go:117] "RemoveContainer" containerID="42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3" Feb 23 06:55:34 crc kubenswrapper[4626]: E0223 06:55:34.787709 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3\": container with ID starting with 42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3 not found: ID does not exist" containerID="42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.787757 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3"} err="failed to get container status \"42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3\": rpc error: code = NotFound desc = could not find container \"42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3\": container with ID starting with 42314a4bd7e09c3bfbadab418221f9c3b09590e4795a908976327a9a1a0b51e3 not found: ID does not exist" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.880586 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "812f7ca1-9cd5-42dd-b69f-28090507fce0" (UID: "812f7ca1-9cd5-42dd-b69f-28090507fce0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.881087 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.881180 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812f7ca1-9cd5-42dd-b69f-28090507fce0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:34 crc kubenswrapper[4626]: I0223 06:55:34.881258 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdm4d\" (UniqueName: \"kubernetes.io/projected/812f7ca1-9cd5-42dd-b69f-28090507fce0-kube-api-access-kdm4d\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:35 crc kubenswrapper[4626]: I0223 06:55:35.046678 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2c8dv"] Feb 23 06:55:35 crc kubenswrapper[4626]: I0223 06:55:35.051563 4626 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-2c8dv"] Feb 23 06:55:35 crc kubenswrapper[4626]: I0223 06:55:35.990398 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" path="/var/lib/kubelet/pods/812f7ca1-9cd5-42dd-b69f-28090507fce0/volumes" Feb 23 06:55:38 crc kubenswrapper[4626]: I0223 06:55:38.381966 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:38 crc kubenswrapper[4626]: I0223 06:55:38.382367 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:38 crc kubenswrapper[4626]: I0223 06:55:38.411111 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:38 crc kubenswrapper[4626]: I0223 06:55:38.781559 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-dqtlp" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.074286 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq"] Feb 23 06:55:40 crc kubenswrapper[4626]: E0223 06:55:40.074701 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="registry-server" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.074721 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="registry-server" Feb 23 06:55:40 crc kubenswrapper[4626]: E0223 06:55:40.074754 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="extract-utilities" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.074761 4626 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="extract-utilities" Feb 23 06:55:40 crc kubenswrapper[4626]: E0223 06:55:40.074775 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="extract-content" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.074782 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="extract-content" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.074939 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="812f7ca1-9cd5-42dd-b69f-28090507fce0" containerName="registry-server" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.076130 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.077965 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mdm9j" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.086065 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq"] Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.155366 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr4c8\" (UniqueName: \"kubernetes.io/projected/57abc151-5a63-4be9-a7e5-eca0e831bdb9-kube-api-access-nr4c8\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.155696 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.155855 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.256714 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.256803 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr4c8\" (UniqueName: \"kubernetes.io/projected/57abc151-5a63-4be9-a7e5-eca0e831bdb9-kube-api-access-nr4c8\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.256879 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-util\") pod 
\"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.257312 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.257612 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.276670 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr4c8\" (UniqueName: \"kubernetes.io/projected/57abc151-5a63-4be9-a7e5-eca0e831bdb9-kube-api-access-nr4c8\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.396281 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:40 crc kubenswrapper[4626]: I0223 06:55:40.775657 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq"] Feb 23 06:55:40 crc kubenswrapper[4626]: W0223 06:55:40.784284 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57abc151_5a63_4be9_a7e5_eca0e831bdb9.slice/crio-ec6c259ab3847a6aa598bc58c81bc14ca4f4a93429cc95ed070d0c714ca20107 WatchSource:0}: Error finding container ec6c259ab3847a6aa598bc58c81bc14ca4f4a93429cc95ed070d0c714ca20107: Status 404 returned error can't find the container with id ec6c259ab3847a6aa598bc58c81bc14ca4f4a93429cc95ed070d0c714ca20107 Feb 23 06:55:41 crc kubenswrapper[4626]: I0223 06:55:41.778821 4626 generic.go:334] "Generic (PLEG): container finished" podID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerID="cf74f25e56c108934411a00ad3ab45498092032ff62a05769cbec7098143b453" exitCode=0 Feb 23 06:55:41 crc kubenswrapper[4626]: I0223 06:55:41.778918 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" event={"ID":"57abc151-5a63-4be9-a7e5-eca0e831bdb9","Type":"ContainerDied","Data":"cf74f25e56c108934411a00ad3ab45498092032ff62a05769cbec7098143b453"} Feb 23 06:55:41 crc kubenswrapper[4626]: I0223 06:55:41.779206 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" event={"ID":"57abc151-5a63-4be9-a7e5-eca0e831bdb9","Type":"ContainerStarted","Data":"ec6c259ab3847a6aa598bc58c81bc14ca4f4a93429cc95ed070d0c714ca20107"} Feb 23 06:55:42 crc kubenswrapper[4626]: I0223 06:55:42.786597 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerID="6a3828ccccdf1821bb7f2ca15b32101d771b4fd1923fff5bfce97965217b2dd2" exitCode=0 Feb 23 06:55:42 crc kubenswrapper[4626]: I0223 06:55:42.786883 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" event={"ID":"57abc151-5a63-4be9-a7e5-eca0e831bdb9","Type":"ContainerDied","Data":"6a3828ccccdf1821bb7f2ca15b32101d771b4fd1923fff5bfce97965217b2dd2"} Feb 23 06:55:43 crc kubenswrapper[4626]: I0223 06:55:43.797095 4626 generic.go:334] "Generic (PLEG): container finished" podID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerID="b3dacf9a68f0489ef3919bc0ac0f93cd70b6745fa2ed03e2eea57c5be7ff1fee" exitCode=0 Feb 23 06:55:43 crc kubenswrapper[4626]: I0223 06:55:43.797274 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" event={"ID":"57abc151-5a63-4be9-a7e5-eca0e831bdb9","Type":"ContainerDied","Data":"b3dacf9a68f0489ef3919bc0ac0f93cd70b6745fa2ed03e2eea57c5be7ff1fee"} Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.021535 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.122476 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-bundle\") pod \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.122589 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr4c8\" (UniqueName: \"kubernetes.io/projected/57abc151-5a63-4be9-a7e5-eca0e831bdb9-kube-api-access-nr4c8\") pod \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.122621 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-util\") pod \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\" (UID: \"57abc151-5a63-4be9-a7e5-eca0e831bdb9\") " Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.123716 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-bundle" (OuterVolumeSpecName: "bundle") pod "57abc151-5a63-4be9-a7e5-eca0e831bdb9" (UID: "57abc151-5a63-4be9-a7e5-eca0e831bdb9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.129121 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57abc151-5a63-4be9-a7e5-eca0e831bdb9-kube-api-access-nr4c8" (OuterVolumeSpecName: "kube-api-access-nr4c8") pod "57abc151-5a63-4be9-a7e5-eca0e831bdb9" (UID: "57abc151-5a63-4be9-a7e5-eca0e831bdb9"). InnerVolumeSpecName "kube-api-access-nr4c8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.134921 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-util" (OuterVolumeSpecName: "util") pod "57abc151-5a63-4be9-a7e5-eca0e831bdb9" (UID: "57abc151-5a63-4be9-a7e5-eca0e831bdb9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.223769 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr4c8\" (UniqueName: \"kubernetes.io/projected/57abc151-5a63-4be9-a7e5-eca0e831bdb9-kube-api-access-nr4c8\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.224145 4626 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-util\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.224224 4626 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/57abc151-5a63-4be9-a7e5-eca0e831bdb9-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.815040 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" event={"ID":"57abc151-5a63-4be9-a7e5-eca0e831bdb9","Type":"ContainerDied","Data":"ec6c259ab3847a6aa598bc58c81bc14ca4f4a93429cc95ed070d0c714ca20107"} Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.815420 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec6c259ab3847a6aa598bc58c81bc14ca4f4a93429cc95ed070d0c714ca20107" Feb 23 06:55:45 crc kubenswrapper[4626]: I0223 06:55:45.815123 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.402080 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw"] Feb 23 06:55:49 crc kubenswrapper[4626]: E0223 06:55:49.404135 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerName="util" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.404205 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerName="util" Feb 23 06:55:49 crc kubenswrapper[4626]: E0223 06:55:49.404283 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerName="pull" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.404331 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerName="pull" Feb 23 06:55:49 crc kubenswrapper[4626]: E0223 06:55:49.404379 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerName="extract" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.404457 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerName="extract" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.404716 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="57abc151-5a63-4be9-a7e5-eca0e831bdb9" containerName="extract" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.405377 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.411566 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-drrvb" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.431493 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw"] Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.595583 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mgs\" (UniqueName: \"kubernetes.io/projected/8e6ad56b-628d-40a0-b847-0e0b0040ad46-kube-api-access-c2mgs\") pod \"openstack-operator-controller-init-6679bf9b57-vqprw\" (UID: \"8e6ad56b-628d-40a0-b847-0e0b0040ad46\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.697362 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mgs\" (UniqueName: \"kubernetes.io/projected/8e6ad56b-628d-40a0-b847-0e0b0040ad46-kube-api-access-c2mgs\") pod \"openstack-operator-controller-init-6679bf9b57-vqprw\" (UID: \"8e6ad56b-628d-40a0-b847-0e0b0040ad46\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.720346 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mgs\" (UniqueName: \"kubernetes.io/projected/8e6ad56b-628d-40a0-b847-0e0b0040ad46-kube-api-access-c2mgs\") pod \"openstack-operator-controller-init-6679bf9b57-vqprw\" (UID: \"8e6ad56b-628d-40a0-b847-0e0b0040ad46\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.723924 4626 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" Feb 23 06:55:49 crc kubenswrapper[4626]: I0223 06:55:49.942274 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw"] Feb 23 06:55:50 crc kubenswrapper[4626]: I0223 06:55:50.855153 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" event={"ID":"8e6ad56b-628d-40a0-b847-0e0b0040ad46","Type":"ContainerStarted","Data":"6c689e62636401a7bfb373347a97772622c286a1ab62f63def79f1ceaed91fca"} Feb 23 06:55:55 crc kubenswrapper[4626]: I0223 06:55:55.906207 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" event={"ID":"8e6ad56b-628d-40a0-b847-0e0b0040ad46","Type":"ContainerStarted","Data":"e6c81d7130f008ecc3e18da5eee70a2146ff6528ce74203d890c893830bad5e5"} Feb 23 06:55:55 crc kubenswrapper[4626]: I0223 06:55:55.907041 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" Feb 23 06:55:55 crc kubenswrapper[4626]: I0223 06:55:55.936511 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" podStartSLOduration=2.051989171 podStartE2EDuration="6.936469259s" podCreationTimestamp="2026-02-23 06:55:49 +0000 UTC" firstStartedPulling="2026-02-23 06:55:49.953895241 +0000 UTC m=+902.293224506" lastFinishedPulling="2026-02-23 06:55:54.838375328 +0000 UTC m=+907.177704594" observedRunningTime="2026-02-23 06:55:55.930309548 +0000 UTC m=+908.269638814" watchObservedRunningTime="2026-02-23 06:55:55.936469259 +0000 UTC m=+908.275798526" Feb 23 06:55:59 crc kubenswrapper[4626]: I0223 06:55:59.728681 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vqprw" Feb 23 06:56:17 crc kubenswrapper[4626]: I0223 06:56:17.976228 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2"] Feb 23 06:56:17 crc kubenswrapper[4626]: I0223 06:56:17.977790 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" Feb 23 06:56:17 crc kubenswrapper[4626]: I0223 06:56:17.993301 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-867lr"] Feb 23 06:56:17 crc kubenswrapper[4626]: I0223 06:56:17.994693 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" Feb 23 06:56:17 crc kubenswrapper[4626]: I0223 06:56:17.998610 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xqwlb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.007544 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.008402 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.011006 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2p8sk" Feb 23 06:56:18 crc kubenswrapper[4626]: W0223 06:56:18.012310 4626 reflector.go:561] object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bz7gt": failed to list *v1.Secret: secrets "designate-operator-controller-manager-dockercfg-bz7gt" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Feb 23 06:56:18 crc kubenswrapper[4626]: E0223 06:56:18.012365 4626 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"designate-operator-controller-manager-dockercfg-bz7gt\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"designate-operator-controller-manager-dockercfg-bz7gt\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.030916 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.032038 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.034254 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2292p" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.039624 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.051716 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-867lr"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.060944 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.083020 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.102424 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.103388 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.106881 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nw82f" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.124727 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.125470 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.128005 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-784vx" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.140381 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgwj\" (UniqueName: \"kubernetes.io/projected/805fd015-c983-4199-b12e-c0073d645e3b-kube-api-access-lmgwj\") pod \"glance-operator-controller-manager-77987464f4-6h2w7\" (UID: \"805fd015-c983-4199-b12e-c0073d645e3b\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.140517 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v54b4\" (UniqueName: \"kubernetes.io/projected/d2c9303d-be86-451a-8834-67abc679952b-kube-api-access-v54b4\") pod \"designate-operator-controller-manager-6d8bf5c495-ljvjt\" (UID: \"d2c9303d-be86-451a-8834-67abc679952b\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.140628 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgp8\" (UniqueName: \"kubernetes.io/projected/d20a75f5-68d0-4f32-bea9-62fdac3a3498-kube-api-access-5xgp8\") pod \"barbican-operator-controller-manager-868647ff47-867lr\" (UID: \"d20a75f5-68d0-4f32-bea9-62fdac3a3498\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.140768 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652rh\" (UniqueName: \"kubernetes.io/projected/4f909aac-ac8d-4ec5-8404-e9c1f77a144c-kube-api-access-652rh\") pod \"cinder-operator-controller-manager-5d946d989d-r6gc2\" (UID: \"4f909aac-ac8d-4ec5-8404-e9c1f77a144c\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.156565 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.165435 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.179665 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.180414 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.184556 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-p6dqp" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.184707 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.186797 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.187550 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.189894 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.190430 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.192986 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9t2kp" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.199591 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qbm99" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.224545 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.245393 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrjm\" (UniqueName: \"kubernetes.io/projected/984c81c1-d4bf-417c-9937-d5de29d33a00-kube-api-access-mvrjm\") pod \"heat-operator-controller-manager-69f49c598c-jlcjb\" (UID: \"984c81c1-d4bf-417c-9937-d5de29d33a00\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.246650 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xgp8\" (UniqueName: \"kubernetes.io/projected/d20a75f5-68d0-4f32-bea9-62fdac3a3498-kube-api-access-5xgp8\") pod \"barbican-operator-controller-manager-868647ff47-867lr\" (UID: \"d20a75f5-68d0-4f32-bea9-62fdac3a3498\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.246709 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td2gw\" (UniqueName: \"kubernetes.io/projected/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-kube-api-access-td2gw\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" 
(UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.246973 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-652rh\" (UniqueName: \"kubernetes.io/projected/4f909aac-ac8d-4ec5-8404-e9c1f77a144c-kube-api-access-652rh\") pod \"cinder-operator-controller-manager-5d946d989d-r6gc2\" (UID: \"4f909aac-ac8d-4ec5-8404-e9c1f77a144c\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.247052 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qclh2\" (UniqueName: \"kubernetes.io/projected/a05a1f00-3183-484d-9c35-7db986a84e8a-kube-api-access-qclh2\") pod \"ironic-operator-controller-manager-554564d7fc-qqpl6\" (UID: \"a05a1f00-3183-484d-9c35-7db986a84e8a\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.247088 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mjh\" (UniqueName: \"kubernetes.io/projected/35a1fc39-9675-4591-95a7-aa1ff016b779-kube-api-access-52mjh\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzl5c\" (UID: \"35a1fc39-9675-4591-95a7-aa1ff016b779\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.248321 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgwj\" (UniqueName: \"kubernetes.io/projected/805fd015-c983-4199-b12e-c0073d645e3b-kube-api-access-lmgwj\") pod \"glance-operator-controller-manager-77987464f4-6h2w7\" (UID: \"805fd015-c983-4199-b12e-c0073d645e3b\") " 
pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.248380 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.248618 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v54b4\" (UniqueName: \"kubernetes.io/projected/d2c9303d-be86-451a-8834-67abc679952b-kube-api-access-v54b4\") pod \"designate-operator-controller-manager-6d8bf5c495-ljvjt\" (UID: \"d2c9303d-be86-451a-8834-67abc679952b\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.292026 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-652rh\" (UniqueName: \"kubernetes.io/projected/4f909aac-ac8d-4ec5-8404-e9c1f77a144c-kube-api-access-652rh\") pod \"cinder-operator-controller-manager-5d946d989d-r6gc2\" (UID: \"4f909aac-ac8d-4ec5-8404-e9c1f77a144c\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.297554 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.298805 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.301213 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.311980 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgwj\" (UniqueName: \"kubernetes.io/projected/805fd015-c983-4199-b12e-c0073d645e3b-kube-api-access-lmgwj\") pod \"glance-operator-controller-manager-77987464f4-6h2w7\" (UID: \"805fd015-c983-4199-b12e-c0073d645e3b\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.318039 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xgp8\" (UniqueName: \"kubernetes.io/projected/d20a75f5-68d0-4f32-bea9-62fdac3a3498-kube-api-access-5xgp8\") pod \"barbican-operator-controller-manager-868647ff47-867lr\" (UID: \"d20a75f5-68d0-4f32-bea9-62fdac3a3498\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.318658 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sj7m9" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.330794 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v54b4\" (UniqueName: \"kubernetes.io/projected/d2c9303d-be86-451a-8834-67abc679952b-kube-api-access-v54b4\") pod \"designate-operator-controller-manager-6d8bf5c495-ljvjt\" (UID: \"d2c9303d-be86-451a-8834-67abc679952b\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.351175 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qclh2\" (UniqueName: \"kubernetes.io/projected/a05a1f00-3183-484d-9c35-7db986a84e8a-kube-api-access-qclh2\") pod 
\"ironic-operator-controller-manager-554564d7fc-qqpl6\" (UID: \"a05a1f00-3183-484d-9c35-7db986a84e8a\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.351287 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52mjh\" (UniqueName: \"kubernetes.io/projected/35a1fc39-9675-4591-95a7-aa1ff016b779-kube-api-access-52mjh\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzl5c\" (UID: \"35a1fc39-9675-4591-95a7-aa1ff016b779\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.351372 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/682961f6-ea8c-4883-93e3-af65115c9507-kube-api-access-qjjmr\") pod \"keystone-operator-controller-manager-b4d948c87-zsrhb\" (UID: \"682961f6-ea8c-4883-93e3-af65115c9507\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.351464 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.351599 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrjm\" (UniqueName: \"kubernetes.io/projected/984c81c1-d4bf-417c-9937-d5de29d33a00-kube-api-access-mvrjm\") pod \"heat-operator-controller-manager-69f49c598c-jlcjb\" (UID: \"984c81c1-d4bf-417c-9937-d5de29d33a00\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" 
Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.351709 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td2gw\" (UniqueName: \"kubernetes.io/projected/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-kube-api-access-td2gw\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:18 crc kubenswrapper[4626]: E0223 06:56:18.352334 4626 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:18 crc kubenswrapper[4626]: E0223 06:56:18.352452 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert podName:47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:18.852426825 +0000 UTC m=+931.191756091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert") pod "infra-operator-controller-manager-79d975b745-6qn7p" (UID: "47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5") : secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.358732 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.383646 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td2gw\" (UniqueName: \"kubernetes.io/projected/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-kube-api-access-td2gw\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.383810 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.384351 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrjm\" (UniqueName: \"kubernetes.io/projected/984c81c1-d4bf-417c-9937-d5de29d33a00-kube-api-access-mvrjm\") pod \"heat-operator-controller-manager-69f49c598c-jlcjb\" (UID: \"984c81c1-d4bf-417c-9937-d5de29d33a00\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.385236 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.383959 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52mjh\" (UniqueName: \"kubernetes.io/projected/35a1fc39-9675-4591-95a7-aa1ff016b779-kube-api-access-52mjh\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzl5c\" (UID: \"35a1fc39-9675-4591-95a7-aa1ff016b779\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.395842 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xf2qj" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.396481 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qclh2\" (UniqueName: \"kubernetes.io/projected/a05a1f00-3183-484d-9c35-7db986a84e8a-kube-api-access-qclh2\") pod \"ironic-operator-controller-manager-554564d7fc-qqpl6\" (UID: \"a05a1f00-3183-484d-9c35-7db986a84e8a\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.417552 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.428129 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.440535 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.441690 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.454227 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6c79\" (UniqueName: \"kubernetes.io/projected/8c4d609a-2d99-44b9-9c86-e20a3965381b-kube-api-access-n6c79\") pod \"manila-operator-controller-manager-54f6768c69-g6gzq\" (UID: \"8c4d609a-2d99-44b9-9c86-e20a3965381b\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.454891 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/682961f6-ea8c-4883-93e3-af65115c9507-kube-api-access-qjjmr\") pod \"keystone-operator-controller-manager-b4d948c87-zsrhb\" (UID: \"682961f6-ea8c-4883-93e3-af65115c9507\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.469809 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.479953 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.480579 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjjmr\" (UniqueName: \"kubernetes.io/projected/682961f6-ea8c-4883-93e3-af65115c9507-kube-api-access-qjjmr\") pod \"keystone-operator-controller-manager-b4d948c87-zsrhb\" (UID: \"682961f6-ea8c-4883-93e3-af65115c9507\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.480859 4626 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.486273 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pks24" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.498099 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.513541 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.533851 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.546638 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.556953 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.557821 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.559859 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6tjj\" (UniqueName: \"kubernetes.io/projected/df31bf79-feee-4644-a1eb-bd6d5af05d7f-kube-api-access-q6tjj\") pod \"nova-operator-controller-manager-567668f5cf-8nqw2\" (UID: \"df31bf79-feee-4644-a1eb-bd6d5af05d7f\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.559896 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdblw\" (UniqueName: \"kubernetes.io/projected/98f392cd-76ee-4062-8ad1-15608b3658dc-kube-api-access-bdblw\") pod \"neutron-operator-controller-manager-64ddbf8bb-mnbrh\" (UID: \"98f392cd-76ee-4062-8ad1-15608b3658dc\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.559932 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs757\" (UniqueName: \"kubernetes.io/projected/cfc5b69c-d190-4f73-a311-c9a371762530-kube-api-access-vs757\") pod \"mariadb-operator-controller-manager-6994f66f48-j524z\" (UID: \"cfc5b69c-d190-4f73-a311-c9a371762530\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.559975 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6c79\" (UniqueName: \"kubernetes.io/projected/8c4d609a-2d99-44b9-9c86-e20a3965381b-kube-api-access-n6c79\") pod \"manila-operator-controller-manager-54f6768c69-g6gzq\" (UID: \"8c4d609a-2d99-44b9-9c86-e20a3965381b\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" Feb 23 
06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.567944 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kj8h8" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.579418 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.594022 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.594903 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.611859 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-j2w7l" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.611968 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.612880 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.613700 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6c79\" (UniqueName: \"kubernetes.io/projected/8c4d609a-2d99-44b9-9c86-e20a3965381b-kube-api-access-n6c79\") pod \"manila-operator-controller-manager-54f6768c69-g6gzq\" (UID: \"8c4d609a-2d99-44b9-9c86-e20a3965381b\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.619012 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.623259 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sxssk" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.623827 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.624425 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.632712 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.634508 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.641683 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.652963 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.635539 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-96hsx" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.662966 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6tjj\" (UniqueName: 
\"kubernetes.io/projected/df31bf79-feee-4644-a1eb-bd6d5af05d7f-kube-api-access-q6tjj\") pod \"nova-operator-controller-manager-567668f5cf-8nqw2\" (UID: \"df31bf79-feee-4644-a1eb-bd6d5af05d7f\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.663001 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5j2q\" (UniqueName: \"kubernetes.io/projected/d8a75041-f9ba-4691-9467-f20f9205daa6-kube-api-access-v5j2q\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.663020 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdblw\" (UniqueName: \"kubernetes.io/projected/98f392cd-76ee-4062-8ad1-15608b3658dc-kube-api-access-bdblw\") pod \"neutron-operator-controller-manager-64ddbf8bb-mnbrh\" (UID: \"98f392cd-76ee-4062-8ad1-15608b3658dc\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.663060 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs757\" (UniqueName: \"kubernetes.io/projected/cfc5b69c-d190-4f73-a311-c9a371762530-kube-api-access-vs757\") pod \"mariadb-operator-controller-manager-6994f66f48-j524z\" (UID: \"cfc5b69c-d190-4f73-a311-c9a371762530\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.663086 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42777\" (UniqueName: \"kubernetes.io/projected/8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a-kube-api-access-42777\") pod 
\"ovn-operator-controller-manager-d44cf6b75-vxbs9\" (UID: \"8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.663110 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm445\" (UniqueName: \"kubernetes.io/projected/7e5b7475-eb3d-4c76-955d-0d9948cf2fe7-kube-api-access-lm445\") pod \"octavia-operator-controller-manager-69f8888797-j9cqz\" (UID: \"7e5b7475-eb3d-4c76-955d-0d9948cf2fe7\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.663138 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.673551 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-hv75x"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.674344 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-db9df"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.674888 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.675294 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.686872 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-hv75x"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.701610 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-tzh2l"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.702773 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.703283 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wz9d6" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.703461 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.703870 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kh7vf" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.704096 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.711817 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-tzh2l"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.721883 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-db9df"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.723435 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.731867 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.752057 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wxfw5" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.752481 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qfdnp" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.754885 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs757\" (UniqueName: \"kubernetes.io/projected/cfc5b69c-d190-4f73-a311-c9a371762530-kube-api-access-vs757\") pod \"mariadb-operator-controller-manager-6994f66f48-j524z\" (UID: \"cfc5b69c-d190-4f73-a311-c9a371762530\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.761030 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdblw\" (UniqueName: 
\"kubernetes.io/projected/98f392cd-76ee-4062-8ad1-15608b3658dc-kube-api-access-bdblw\") pod \"neutron-operator-controller-manager-64ddbf8bb-mnbrh\" (UID: \"98f392cd-76ee-4062-8ad1-15608b3658dc\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.764108 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6tjj\" (UniqueName: \"kubernetes.io/projected/df31bf79-feee-4644-a1eb-bd6d5af05d7f-kube-api-access-q6tjj\") pod \"nova-operator-controller-manager-567668f5cf-8nqw2\" (UID: \"df31bf79-feee-4644-a1eb-bd6d5af05d7f\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.764744 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:18 crc kubenswrapper[4626]: E0223 06:56:18.765032 4626 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:18 crc kubenswrapper[4626]: E0223 06:56:18.765857 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert podName:d8a75041-f9ba-4691-9467-f20f9205daa6 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:19.265842091 +0000 UTC m=+931.605171347 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" (UID: "d8a75041-f9ba-4691-9467-f20f9205daa6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.765629 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5j2q\" (UniqueName: \"kubernetes.io/projected/d8a75041-f9ba-4691-9467-f20f9205daa6-kube-api-access-v5j2q\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.766220 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42777\" (UniqueName: \"kubernetes.io/projected/8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a-kube-api-access-42777\") pod \"ovn-operator-controller-manager-d44cf6b75-vxbs9\" (UID: \"8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.766301 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm445\" (UniqueName: \"kubernetes.io/projected/7e5b7475-eb3d-4c76-955d-0d9948cf2fe7-kube-api-access-lm445\") pod \"octavia-operator-controller-manager-69f8888797-j9cqz\" (UID: \"7e5b7475-eb3d-4c76-955d-0d9948cf2fe7\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.784331 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5j2q\" (UniqueName: \"kubernetes.io/projected/d8a75041-f9ba-4691-9467-f20f9205daa6-kube-api-access-v5j2q\") pod 
\"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.794983 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.824364 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm445\" (UniqueName: \"kubernetes.io/projected/7e5b7475-eb3d-4c76-955d-0d9948cf2fe7-kube-api-access-lm445\") pod \"octavia-operator-controller-manager-69f8888797-j9cqz\" (UID: \"7e5b7475-eb3d-4c76-955d-0d9948cf2fe7\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.833904 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42777\" (UniqueName: \"kubernetes.io/projected/8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a-kube-api-access-42777\") pod \"ovn-operator-controller-manager-d44cf6b75-vxbs9\" (UID: \"8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.854079 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.859687 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.864322 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84"] Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.864531 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-kxgmb" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.867971 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.868238 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsjs\" (UniqueName: \"kubernetes.io/projected/7f639fb4-3160-4bd7-ab2a-86ea80cb51ed-kube-api-access-nqsjs\") pod \"swift-operator-controller-manager-68f46476f-hv75x\" (UID: \"7f639fb4-3160-4bd7-ab2a-86ea80cb51ed\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.868288 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4jrv\" (UniqueName: \"kubernetes.io/projected/6f3e393a-a659-44ae-bad8-6e4ff2d649ce-kube-api-access-k4jrv\") pod \"telemetry-operator-controller-manager-7f45b4ff68-vdx4s\" (UID: \"6f3e393a-a659-44ae-bad8-6e4ff2d649ce\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.868322 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9p67\" (UniqueName: \"kubernetes.io/projected/d9ef444e-448b-4576-960b-5861b7c19720-kube-api-access-c9p67\") pod \"test-operator-controller-manager-7866795846-tzh2l\" (UID: \"d9ef444e-448b-4576-960b-5861b7c19720\") " pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.868367 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgwkd\" (UniqueName: \"kubernetes.io/projected/1867cac1-c043-4677-9c64-786b1f261fd5-kube-api-access-cgwkd\") pod \"placement-operator-controller-manager-8497b45c89-db9df\" (UID: \"1867cac1-c043-4677-9c64-786b1f261fd5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" Feb 23 06:56:18 crc kubenswrapper[4626]: E0223 06:56:18.868468 4626 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:18 crc kubenswrapper[4626]: E0223 06:56:18.868548 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert podName:47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:19.868523622 +0000 UTC m=+932.207852889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert") pod "infra-operator-controller-manager-79d975b745-6qn7p" (UID: "47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5") : secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.915346 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.952963 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.963431 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.970434 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqpkw\" (UniqueName: \"kubernetes.io/projected/cbbbb529-024e-4f9e-ad1c-063c63f39324-kube-api-access-tqpkw\") pod \"watcher-operator-controller-manager-5db88f68c-qfd84\" (UID: \"cbbbb529-024e-4f9e-ad1c-063c63f39324\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.970559 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsjs\" (UniqueName: \"kubernetes.io/projected/7f639fb4-3160-4bd7-ab2a-86ea80cb51ed-kube-api-access-nqsjs\") pod \"swift-operator-controller-manager-68f46476f-hv75x\" (UID: \"7f639fb4-3160-4bd7-ab2a-86ea80cb51ed\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.970584 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4jrv\" (UniqueName: \"kubernetes.io/projected/6f3e393a-a659-44ae-bad8-6e4ff2d649ce-kube-api-access-k4jrv\") pod \"telemetry-operator-controller-manager-7f45b4ff68-vdx4s\" (UID: \"6f3e393a-a659-44ae-bad8-6e4ff2d649ce\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.970608 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9p67\" (UniqueName: \"kubernetes.io/projected/d9ef444e-448b-4576-960b-5861b7c19720-kube-api-access-c9p67\") pod \"test-operator-controller-manager-7866795846-tzh2l\" (UID: \"d9ef444e-448b-4576-960b-5861b7c19720\") " pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" Feb 23 06:56:18 crc kubenswrapper[4626]: I0223 06:56:18.970641 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgwkd\" (UniqueName: \"kubernetes.io/projected/1867cac1-c043-4677-9c64-786b1f261fd5-kube-api-access-cgwkd\") pod \"placement-operator-controller-manager-8497b45c89-db9df\" (UID: \"1867cac1-c043-4677-9c64-786b1f261fd5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:18.992111 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:18.993089 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.011371 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgwkd\" (UniqueName: \"kubernetes.io/projected/1867cac1-c043-4677-9c64-786b1f261fd5-kube-api-access-cgwkd\") pod \"placement-operator-controller-manager-8497b45c89-db9df\" (UID: \"1867cac1-c043-4677-9c64-786b1f261fd5\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.016225 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.021742 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4jrv\" (UniqueName: \"kubernetes.io/projected/6f3e393a-a659-44ae-bad8-6e4ff2d649ce-kube-api-access-k4jrv\") pod \"telemetry-operator-controller-manager-7f45b4ff68-vdx4s\" (UID: \"6f3e393a-a659-44ae-bad8-6e4ff2d649ce\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.025824 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qm287" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.025977 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.026092 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.038771 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsjs\" (UniqueName: 
\"kubernetes.io/projected/7f639fb4-3160-4bd7-ab2a-86ea80cb51ed-kube-api-access-nqsjs\") pod \"swift-operator-controller-manager-68f46476f-hv75x\" (UID: \"7f639fb4-3160-4bd7-ab2a-86ea80cb51ed\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.043147 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.050398 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9p67\" (UniqueName: \"kubernetes.io/projected/d9ef444e-448b-4576-960b-5861b7c19720-kube-api-access-c9p67\") pod \"test-operator-controller-manager-7866795846-tzh2l\" (UID: \"d9ef444e-448b-4576-960b-5861b7c19720\") " pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.071021 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.072631 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqpkw\" (UniqueName: \"kubernetes.io/projected/cbbbb529-024e-4f9e-ad1c-063c63f39324-kube-api-access-tqpkw\") pod \"watcher-operator-controller-manager-5db88f68c-qfd84\" (UID: \"cbbbb529-024e-4f9e-ad1c-063c63f39324\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.084720 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.085572 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.085847 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.087422 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-w58f6" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.105412 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.107001 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqpkw\" (UniqueName: \"kubernetes.io/projected/cbbbb529-024e-4f9e-ad1c-063c63f39324-kube-api-access-tqpkw\") pod \"watcher-operator-controller-manager-5db88f68c-qfd84\" (UID: \"cbbbb529-024e-4f9e-ad1c-063c63f39324\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.147437 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.158299 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.175878 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh9gh\" (UniqueName: \"kubernetes.io/projected/02bcecce-5716-45c6-8d40-f1da91d26673-kube-api-access-dh9gh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vzwhb\" (UID: \"02bcecce-5716-45c6-8d40-f1da91d26673\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.175930 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvts2\" (UniqueName: \"kubernetes.io/projected/fab598e9-890d-4f24-b26b-8f5b507a86c8-kube-api-access-qvts2\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.175984 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.176018 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 
23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.189865 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.209451 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bz7gt" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.216998 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.276611 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh9gh\" (UniqueName: \"kubernetes.io/projected/02bcecce-5716-45c6-8d40-f1da91d26673-kube-api-access-dh9gh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vzwhb\" (UID: \"02bcecce-5716-45c6-8d40-f1da91d26673\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.276666 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvts2\" (UniqueName: \"kubernetes.io/projected/fab598e9-890d-4f24-b26b-8f5b507a86c8-kube-api-access-qvts2\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.276701 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.276737 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.276763 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.276919 4626 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.276965 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:19.776952588 +0000 UTC m=+932.116281845 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "metrics-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.277349 4626 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.277381 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:19.777374033 +0000 UTC m=+932.116703299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.277455 4626 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.277540 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert podName:d8a75041-f9ba-4691-9467-f20f9205daa6 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:20.277519628 +0000 UTC m=+932.616848894 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" (UID: "d8a75041-f9ba-4691-9467-f20f9205daa6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.305316 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh9gh\" (UniqueName: \"kubernetes.io/projected/02bcecce-5716-45c6-8d40-f1da91d26673-kube-api-access-dh9gh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vzwhb\" (UID: \"02bcecce-5716-45c6-8d40-f1da91d26673\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.306720 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvts2\" (UniqueName: \"kubernetes.io/projected/fab598e9-890d-4f24-b26b-8f5b507a86c8-kube-api-access-qvts2\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.433983 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.440011 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.461644 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.787584 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.787807 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.787803 4626 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.787914 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:20.787892245 +0000 UTC m=+933.127221501 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "metrics-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.787954 4626 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.788033 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:20.788009285 +0000 UTC m=+933.127338552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.890047 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.890225 4626 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: E0223 06:56:19.890609 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert 
podName:47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:21.890581421 +0000 UTC m=+934.229910677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert") pod "infra-operator-controller-manager-79d975b745-6qn7p" (UID: "47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5") : secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.964056 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.969631 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.986315 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-db9df"] Feb 23 06:56:19 crc kubenswrapper[4626]: I0223 06:56:19.999677 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.104314 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" event={"ID":"a05a1f00-3183-484d-9c35-7db986a84e8a","Type":"ContainerStarted","Data":"fdb4d341eaf513481c36f03a9c04c34786b40cc8d9b14968b219d28153a7dfdf"} Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.107012 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" event={"ID":"df31bf79-feee-4644-a1eb-bd6d5af05d7f","Type":"ContainerStarted","Data":"5f85b6bef2b16434f325713244ce35d3f9164cc9b4b49ffa4221ec4175ea1f0f"} Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 
06:56:20.108023 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" event={"ID":"805fd015-c983-4199-b12e-c0073d645e3b","Type":"ContainerStarted","Data":"209981e3c13c9bd9c6b66c7096bfaac765c32b1e687b437dab5b69da790704d3"} Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.109811 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" event={"ID":"4f909aac-ac8d-4ec5-8404-e9c1f77a144c","Type":"ContainerStarted","Data":"5c1a3d2b440d6c834a8bf1c244de207a45a3c34aa4c6026a0d6cfa3df89aee34"} Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.111180 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" event={"ID":"1867cac1-c043-4677-9c64-786b1f261fd5","Type":"ContainerStarted","Data":"0186922bb7e60fe45d8cf2915d66d4895907e29accb0461c3e0ac7cdc23cd168"} Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.112405 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" event={"ID":"98f392cd-76ee-4062-8ad1-15608b3658dc","Type":"ContainerStarted","Data":"e4b232ec8065e7366509bb982ef865194d37e7b987094fc0766dd1efa9d3cb58"} Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.125871 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb"] Feb 23 06:56:20 crc kubenswrapper[4626]: W0223 06:56:20.169183 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c4d609a_2d99_44b9_9c86_e20a3965381b.slice/crio-3825460e73406b66c078d1ed27a27e91f1f88f34f8ff945787fccdced41b43c3 WatchSource:0}: Error finding container 3825460e73406b66c078d1ed27a27e91f1f88f34f8ff945787fccdced41b43c3: Status 404 returned error can't find 
the container with id 3825460e73406b66c078d1ed27a27e91f1f88f34f8ff945787fccdced41b43c3 Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.170302 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-867lr"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.183867 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq"] Feb 23 06:56:20 crc kubenswrapper[4626]: W0223 06:56:20.200816 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod984c81c1_d4bf_417c_9937_d5de29d33a00.slice/crio-aafa8a5648a02ea002fbe29b47543ea908759b1ca8561611d849dce12ce26d3a WatchSource:0}: Error finding container aafa8a5648a02ea002fbe29b47543ea908759b1ca8561611d849dce12ce26d3a: Status 404 returned error can't find the container with id aafa8a5648a02ea002fbe29b47543ea908759b1ca8561611d849dce12ce26d3a Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.201738 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb"] Feb 23 06:56:20 crc kubenswrapper[4626]: W0223 06:56:20.207316 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e5b7475_eb3d_4c76_955d_0d9948cf2fe7.slice/crio-577caa59feeb3c360fd16911bafef5916cccf07282fe1bba4c48a954b186ec5a WatchSource:0}: Error finding container 577caa59feeb3c360fd16911bafef5916cccf07282fe1bba4c48a954b186ec5a: Status 404 returned error can't find the container with id 577caa59feeb3c360fd16911bafef5916cccf07282fe1bba4c48a954b186ec5a Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.211099 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz"] Feb 23 06:56:20 crc kubenswrapper[4626]: 
I0223 06:56:20.217208 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.228918 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.235276 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-tzh2l"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.251393 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-hv75x"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.268689 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.273317 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84"] Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.282838 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 
-3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c9p67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-tzh2l_openstack-operators(d9ef444e-448b-4576-960b-5861b7c19720): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.284890 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" 
podUID="d9ef444e-448b-4576-960b-5861b7c19720" Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.288359 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb"] Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.293551 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dh9gh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-vzwhb_openstack-operators(02bcecce-5716-45c6-8d40-f1da91d26673): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.295036 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" podUID="02bcecce-5716-45c6-8d40-f1da91d26673" Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.298416 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s"] Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.308149 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.308478 4626 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.308648 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert podName:d8a75041-f9ba-4691-9467-f20f9205daa6 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:22.308621966 +0000 UTC m=+934.647951222 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" (UID: "d8a75041-f9ba-4691-9467-f20f9205daa6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.309076 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z"] Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.322722 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqpkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-qfd84_openstack-operators(cbbbb529-024e-4f9e-ad1c-063c63f39324): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.323825 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" podUID="cbbbb529-024e-4f9e-ad1c-063c63f39324" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.324425 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nqsjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-hv75x_openstack-operators(7f639fb4-3160-4bd7-ab2a-86ea80cb51ed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.325362 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k4jrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-vdx4s_openstack-operators(6f3e393a-a659-44ae-bad8-6e4ff2d649ce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.326649 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" podUID="6f3e393a-a659-44ae-bad8-6e4ff2d649ce" Feb 23 06:56:20 crc 
kubenswrapper[4626]: E0223 06:56:20.326775 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" podUID="7f639fb4-3160-4bd7-ab2a-86ea80cb51ed" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.329973 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-42777,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-vxbs9_openstack-operators(8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.330231 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vs757,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-j524z_openstack-operators(cfc5b69c-d190-4f73-a311-c9a371762530): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.332034 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" podUID="cfc5b69c-d190-4f73-a311-c9a371762530" Feb 23 06:56:20 crc 
kubenswrapper[4626]: E0223 06:56:20.332077 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" podUID="8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a" Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.816598 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:20 crc kubenswrapper[4626]: I0223 06:56:20.816700 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.816906 4626 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.816985 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:22.816959818 +0000 UTC m=+935.156289083 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "metrics-server-cert" not found Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.817402 4626 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 06:56:20 crc kubenswrapper[4626]: E0223 06:56:20.817510 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:22.817471161 +0000 UTC m=+935.156800427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "webhook-server-cert" not found Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.138294 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" event={"ID":"8c4d609a-2d99-44b9-9c86-e20a3965381b","Type":"ContainerStarted","Data":"3825460e73406b66c078d1ed27a27e91f1f88f34f8ff945787fccdced41b43c3"} Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.148538 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" event={"ID":"7e5b7475-eb3d-4c76-955d-0d9948cf2fe7","Type":"ContainerStarted","Data":"577caa59feeb3c360fd16911bafef5916cccf07282fe1bba4c48a954b186ec5a"} Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.153016 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" event={"ID":"7f639fb4-3160-4bd7-ab2a-86ea80cb51ed","Type":"ContainerStarted","Data":"ac61695103895b40434337095f5aafd84543503863d80b0312db12bafaac8abb"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.155458 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" podUID="7f639fb4-3160-4bd7-ab2a-86ea80cb51ed" Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.158673 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" event={"ID":"984c81c1-d4bf-417c-9937-d5de29d33a00","Type":"ContainerStarted","Data":"aafa8a5648a02ea002fbe29b47543ea908759b1ca8561611d849dce12ce26d3a"} Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.162571 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" event={"ID":"d9ef444e-448b-4576-960b-5861b7c19720","Type":"ContainerStarted","Data":"a790abe9456d4ee1a4dd1bf43bf382f60af2be2965b952b0089fa75f61d3e42d"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.173641 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" podUID="d9ef444e-448b-4576-960b-5861b7c19720" Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.178145 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" event={"ID":"02bcecce-5716-45c6-8d40-f1da91d26673","Type":"ContainerStarted","Data":"d985adfa7e25f8fd8d3ff90d71f57b126426bbd858083f2c6f9179a074f70b55"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.183368 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" podUID="02bcecce-5716-45c6-8d40-f1da91d26673" Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.194208 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" event={"ID":"cbbbb529-024e-4f9e-ad1c-063c63f39324","Type":"ContainerStarted","Data":"a813c75add4db872c449e86ab98c53eeab79310924c29e2e40d4906b9e8541d0"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.195903 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" podUID="cbbbb529-024e-4f9e-ad1c-063c63f39324" Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.219769 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" event={"ID":"cfc5b69c-d190-4f73-a311-c9a371762530","Type":"ContainerStarted","Data":"81979f3d688a27fad8c60e16009d4e07499def2f56ff56149a9c8af166a7cc2a"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.225644 4626 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" podUID="cfc5b69c-d190-4f73-a311-c9a371762530" Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.233395 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" event={"ID":"d20a75f5-68d0-4f32-bea9-62fdac3a3498","Type":"ContainerStarted","Data":"d10276c272978afd50bc8e92195270fd1837be60578a228fa3f1aa3deead054b"} Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.239520 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" event={"ID":"8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a","Type":"ContainerStarted","Data":"8e0a2014a50abbaa42ad96a724a21b89f317751c281690b09b1a08a92f49ed16"} Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.255139 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" event={"ID":"d2c9303d-be86-451a-8834-67abc679952b","Type":"ContainerStarted","Data":"18ca9b00f6c5056df5aefb33655fce20e404b7b1110104f1a0eea7bff08307eb"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.257061 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" podUID="8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a" Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.257793 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" event={"ID":"6f3e393a-a659-44ae-bad8-6e4ff2d649ce","Type":"ContainerStarted","Data":"3a98b6ada1accb5d07fdc037f52ccf78e761dfe098d39d3e7e40d9028ab57e27"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.268627 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" podUID="6f3e393a-a659-44ae-bad8-6e4ff2d649ce" Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.274476 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" event={"ID":"35a1fc39-9675-4591-95a7-aa1ff016b779","Type":"ContainerStarted","Data":"dbc35a1ae275b8ae00312b1e8d3fcdf8af678bb36cb03f6f0f451b6f6834e361"} Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.281094 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" event={"ID":"682961f6-ea8c-4883-93e3-af65115c9507","Type":"ContainerStarted","Data":"24450b05069c7420b93fb91b82ecdcfdf5ddbe3acb73ac7132a32ea3c488074e"} Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.931944 4626 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:21 crc kubenswrapper[4626]: E0223 06:56:21.932033 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert podName:47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:25.932015483 +0000 UTC m=+938.271344749 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert") pod "infra-operator-controller-manager-79d975b745-6qn7p" (UID: "47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5") : secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:21 crc kubenswrapper[4626]: I0223 06:56:21.933163 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.293005 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" podUID="cfc5b69c-d190-4f73-a311-c9a371762530" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.294391 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" podUID="cbbbb529-024e-4f9e-ad1c-063c63f39324" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.294919 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" podUID="d9ef444e-448b-4576-960b-5861b7c19720" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.295378 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" podUID="02bcecce-5716-45c6-8d40-f1da91d26673" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.295921 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" podUID="7f639fb4-3160-4bd7-ab2a-86ea80cb51ed" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.296039 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" podUID="8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.296107 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" podUID="6f3e393a-a659-44ae-bad8-6e4ff2d649ce" Feb 23 06:56:22 crc kubenswrapper[4626]: I0223 06:56:22.344818 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.345804 4626 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.345878 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert podName:d8a75041-f9ba-4691-9467-f20f9205daa6 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:26.34585582 +0000 UTC m=+938.685185086 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" (UID: "d8a75041-f9ba-4691-9467-f20f9205daa6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:22 crc kubenswrapper[4626]: I0223 06:56:22.852859 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:22 crc kubenswrapper[4626]: I0223 06:56:22.852945 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.853544 4626 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.853649 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:26.853623147 +0000 UTC m=+939.192952412 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "webhook-server-cert" not found Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.855981 4626 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 06:56:22 crc kubenswrapper[4626]: E0223 06:56:22.856103 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:26.856078775 +0000 UTC m=+939.195408041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "metrics-server-cert" not found Feb 23 06:56:25 crc kubenswrapper[4626]: I0223 06:56:25.685211 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:56:25 crc kubenswrapper[4626]: I0223 06:56:25.685538 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:56:26 crc kubenswrapper[4626]: I0223 06:56:26.013732 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.013944 4626 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.014259 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert podName:47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:34.014235094 +0000 UTC m=+946.353564361 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert") pod "infra-operator-controller-manager-79d975b745-6qn7p" (UID: "47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5") : secret "infra-operator-webhook-server-cert" not found Feb 23 06:56:26 crc kubenswrapper[4626]: I0223 06:56:26.421190 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.421473 4626 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.421640 4626 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert podName:d8a75041-f9ba-4691-9467-f20f9205daa6 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:34.421569649 +0000 UTC m=+946.760898914 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" (UID: "d8a75041-f9ba-4691-9467-f20f9205daa6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 06:56:26 crc kubenswrapper[4626]: I0223 06:56:26.935979 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:26 crc kubenswrapper[4626]: I0223 06:56:26.936067 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.936185 4626 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.936282 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. 
No retries permitted until 2026-02-23 06:56:34.93625773 +0000 UTC m=+947.275586986 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "webhook-server-cert" not found Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.936332 4626 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 06:56:26 crc kubenswrapper[4626]: E0223 06:56:26.936438 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:34.936415117 +0000 UTC m=+947.275744383 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "metrics-server-cert" not found Feb 23 06:56:33 crc kubenswrapper[4626]: E0223 06:56:33.395798 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 23 06:56:33 crc kubenswrapper[4626]: E0223 06:56:33.396895 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdblw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-mnbrh_openstack-operators(98f392cd-76ee-4062-8ad1-15608b3658dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:56:33 crc kubenswrapper[4626]: E0223 06:56:33.402711 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" podUID="98f392cd-76ee-4062-8ad1-15608b3658dc" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.049572 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.058217 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5-cert\") pod \"infra-operator-controller-manager-79d975b745-6qn7p\" (UID: \"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.094235 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:34 crc kubenswrapper[4626]: E0223 06:56:34.396958 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" podUID="98f392cd-76ee-4062-8ad1-15608b3658dc" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.463665 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.474114 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8a75041-f9ba-4691-9467-f20f9205daa6-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75\" (UID: \"d8a75041-f9ba-4691-9467-f20f9205daa6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.605687 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.969776 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.970072 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:34 crc kubenswrapper[4626]: E0223 06:56:34.969987 4626 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 06:56:34 crc kubenswrapper[4626]: E0223 06:56:34.970180 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs podName:fab598e9-890d-4f24-b26b-8f5b507a86c8 nodeName:}" failed. No retries permitted until 2026-02-23 06:56:50.970162217 +0000 UTC m=+963.309491483 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-mf52c" (UID: "fab598e9-890d-4f24-b26b-8f5b507a86c8") : secret "webhook-server-cert" not found Feb 23 06:56:34 crc kubenswrapper[4626]: I0223 06:56:34.990326 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.405911 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p"] Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.451886 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" event={"ID":"d2c9303d-be86-451a-8834-67abc679952b","Type":"ContainerStarted","Data":"880ad506ed0fb525fdb338ad74374d942d75f6164ebc53dc22cb59378de07b1c"} Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.452115 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.453887 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" event={"ID":"1867cac1-c043-4677-9c64-786b1f261fd5","Type":"ContainerStarted","Data":"f56e46e5843af41a2b3d372a45abde15a848a2a7a097cf5b1cd885358cdbd113"} Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.454300 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.464042 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.473386 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" event={"ID":"a05a1f00-3183-484d-9c35-7db986a84e8a","Type":"ContainerStarted","Data":"2eb3e68ddff648e9a59e5a225b51ea4fb0566ef5f5bd29688c7f1090a5e114f1"} Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.474079 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.483303 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" event={"ID":"4f909aac-ac8d-4ec5-8404-e9c1f77a144c","Type":"ContainerStarted","Data":"6f346f5c02c89a62a9a9fbb921904d094c11a9d93239cdb28b10b2851a8a29f8"} Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.483674 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.493891 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" podStartSLOduration=4.426410274 podStartE2EDuration="19.493878496s" podCreationTimestamp="2026-02-23 06:56:17 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.245320137 +0000 UTC m=+932.584649404" lastFinishedPulling="2026-02-23 06:56:35.312788361 +0000 UTC m=+947.652117626" observedRunningTime="2026-02-23 06:56:36.492658977 +0000 UTC m=+948.831988242" 
watchObservedRunningTime="2026-02-23 06:56:36.493878496 +0000 UTC m=+948.833207762" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.518272 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" podStartSLOduration=3.721031385 podStartE2EDuration="19.518256098s" podCreationTimestamp="2026-02-23 06:56:17 +0000 UTC" firstStartedPulling="2026-02-23 06:56:19.5160532 +0000 UTC m=+931.855382466" lastFinishedPulling="2026-02-23 06:56:35.313277912 +0000 UTC m=+947.652607179" observedRunningTime="2026-02-23 06:56:36.51675008 +0000 UTC m=+948.856079347" watchObservedRunningTime="2026-02-23 06:56:36.518256098 +0000 UTC m=+948.857585365" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.562873 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75"] Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.578315 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" podStartSLOduration=2.852146406 podStartE2EDuration="18.578295247s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.219291121 +0000 UTC m=+932.558620387" lastFinishedPulling="2026-02-23 06:56:35.945439962 +0000 UTC m=+948.284769228" observedRunningTime="2026-02-23 06:56:36.549213375 +0000 UTC m=+948.888542641" watchObservedRunningTime="2026-02-23 06:56:36.578295247 +0000 UTC m=+948.917624512" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.598757 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" podStartSLOduration=4.02359678 podStartE2EDuration="18.598739596s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:19.982203074 +0000 UTC 
m=+932.321532340" lastFinishedPulling="2026-02-23 06:56:34.55734589 +0000 UTC m=+946.896675156" observedRunningTime="2026-02-23 06:56:36.597746463 +0000 UTC m=+948.937075719" watchObservedRunningTime="2026-02-23 06:56:36.598739596 +0000 UTC m=+948.938068862" Feb 23 06:56:36 crc kubenswrapper[4626]: I0223 06:56:36.649640 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" podStartSLOduration=3.321100236 podStartE2EDuration="18.649620029s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:19.984260843 +0000 UTC m=+932.323590109" lastFinishedPulling="2026-02-23 06:56:35.312780636 +0000 UTC m=+947.652109902" observedRunningTime="2026-02-23 06:56:36.643860261 +0000 UTC m=+948.983189528" watchObservedRunningTime="2026-02-23 06:56:36.649620029 +0000 UTC m=+948.988949295" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.503968 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" event={"ID":"35a1fc39-9675-4591-95a7-aa1ff016b779","Type":"ContainerStarted","Data":"cd8011e5e136ccd853d60ceaaa4023a64afb9ca34ec9a348871b0083ade5299a"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.505363 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" event={"ID":"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5","Type":"ContainerStarted","Data":"9834121cd5e92fe0d0d1ef594b10e1e5de70a6f50665775836d17dd04b83d25a"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.519430 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" event={"ID":"984c81c1-d4bf-417c-9937-d5de29d33a00","Type":"ContainerStarted","Data":"d1210215f8d92a3556dd07359eb79a9bee654c36f0eecd3e0d63317e150168e8"} Feb 23 06:56:37 crc 
kubenswrapper[4626]: I0223 06:56:37.519593 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.534102 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" event={"ID":"682961f6-ea8c-4883-93e3-af65115c9507","Type":"ContainerStarted","Data":"32317f4a4b1ed4c3a5a9f482d4a7cf1ec6752e3a62a0d34037238d4b09265762"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.534609 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.545780 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" podStartSLOduration=4.398515327 podStartE2EDuration="19.545769331s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.207303994 +0000 UTC m=+932.546633261" lastFinishedPulling="2026-02-23 06:56:35.354557999 +0000 UTC m=+947.693887265" observedRunningTime="2026-02-23 06:56:37.540835631 +0000 UTC m=+949.880164897" watchObservedRunningTime="2026-02-23 06:56:37.545769331 +0000 UTC m=+949.885098597" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.550667 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" event={"ID":"cbbbb529-024e-4f9e-ad1c-063c63f39324","Type":"ContainerStarted","Data":"03dd243462845fe32d04b303d2f201593c9e3d8122924999e4d4e3651296a3b2"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.551372 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" Feb 23 06:56:37 crc 
kubenswrapper[4626]: I0223 06:56:37.568901 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" event={"ID":"df31bf79-feee-4644-a1eb-bd6d5af05d7f","Type":"ContainerStarted","Data":"87e9cf2bca6db2fc10ec6111a8492a702f03290771183e619f8d9563ddf86881"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.569590 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.591628 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" event={"ID":"805fd015-c983-4199-b12e-c0073d645e3b","Type":"ContainerStarted","Data":"1e9a967c57ab9a6e170b7c8dce04fea4e56b64bd146086ad605092bc02227bc8"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.592414 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.598199 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" event={"ID":"d20a75f5-68d0-4f32-bea9-62fdac3a3498","Type":"ContainerStarted","Data":"22e7993e33ee7408d2c50ed6e46b420723b94251d1725ae4006571e38f2e50eb"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.598752 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.621177 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" 
event={"ID":"8c4d609a-2d99-44b9-9c86-e20a3965381b","Type":"ContainerStarted","Data":"7f31d6ceb65fce3484ac19bb1c3b8a7afb3b3e0d2f118757716b8d79c08f55b1"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.621214 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.633206 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" event={"ID":"d8a75041-f9ba-4691-9467-f20f9205daa6","Type":"ContainerStarted","Data":"2f01e132a5ec77324597046b4f8826b217b312e3a6d21475f87a395345d676a6"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.640980 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" event={"ID":"7e5b7475-eb3d-4c76-955d-0d9948cf2fe7","Type":"ContainerStarted","Data":"ff967af81853c9fa02eae577231a65c2f264a35472fbe8ba3ebdb375c38794a2"} Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.641020 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.655375 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" podStartSLOduration=3.835094702 podStartE2EDuration="19.655359837s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.144622997 +0000 UTC m=+932.483952263" lastFinishedPulling="2026-02-23 06:56:35.964888133 +0000 UTC m=+948.304217398" observedRunningTime="2026-02-23 06:56:37.591665108 +0000 UTC m=+949.930994374" watchObservedRunningTime="2026-02-23 06:56:37.655359837 +0000 UTC m=+949.994689103" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 
06:56:37.711246 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" podStartSLOduration=4.019246849 podStartE2EDuration="19.711231809s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.3225866 +0000 UTC m=+932.661915866" lastFinishedPulling="2026-02-23 06:56:36.014571559 +0000 UTC m=+948.353900826" observedRunningTime="2026-02-23 06:56:37.661733852 +0000 UTC m=+950.001063119" watchObservedRunningTime="2026-02-23 06:56:37.711231809 +0000 UTC m=+950.050561076" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.784579 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" podStartSLOduration=3.813787366 podStartE2EDuration="19.784562182s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:19.988308563 +0000 UTC m=+932.327637830" lastFinishedPulling="2026-02-23 06:56:35.959083379 +0000 UTC m=+948.298412646" observedRunningTime="2026-02-23 06:56:37.779737978 +0000 UTC m=+950.119067245" watchObservedRunningTime="2026-02-23 06:56:37.784562182 +0000 UTC m=+950.123891448" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.785207 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" podStartSLOduration=5.743774584 podStartE2EDuration="20.785200906s" podCreationTimestamp="2026-02-23 06:56:17 +0000 UTC" firstStartedPulling="2026-02-23 06:56:19.516002264 +0000 UTC m=+931.855331529" lastFinishedPulling="2026-02-23 06:56:34.557428585 +0000 UTC m=+946.896757851" observedRunningTime="2026-02-23 06:56:37.709093177 +0000 UTC m=+950.048422444" watchObservedRunningTime="2026-02-23 06:56:37.785200906 +0000 UTC m=+950.124530172" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.824478 4626 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" podStartSLOduration=4.6890251880000005 podStartE2EDuration="19.824447288s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.218768306 +0000 UTC m=+932.558097562" lastFinishedPulling="2026-02-23 06:56:35.354190395 +0000 UTC m=+947.693519662" observedRunningTime="2026-02-23 06:56:37.819310045 +0000 UTC m=+950.158639310" watchObservedRunningTime="2026-02-23 06:56:37.824447288 +0000 UTC m=+950.163776554" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.904967 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" podStartSLOduration=4.762188385 podStartE2EDuration="19.904940774s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.181843359 +0000 UTC m=+932.521172625" lastFinishedPulling="2026-02-23 06:56:35.324595748 +0000 UTC m=+947.663925014" observedRunningTime="2026-02-23 06:56:37.870818481 +0000 UTC m=+950.210147747" watchObservedRunningTime="2026-02-23 06:56:37.904940774 +0000 UTC m=+950.244270040" Feb 23 06:56:37 crc kubenswrapper[4626]: I0223 06:56:37.906909 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" podStartSLOduration=5.7472843099999995 podStartE2EDuration="20.906899406s" podCreationTimestamp="2026-02-23 06:56:17 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.176725321 +0000 UTC m=+932.516054587" lastFinishedPulling="2026-02-23 06:56:35.336340417 +0000 UTC m=+947.675669683" observedRunningTime="2026-02-23 06:56:37.902520772 +0000 UTC m=+950.241850038" watchObservedRunningTime="2026-02-23 06:56:37.906899406 +0000 UTC m=+950.246228672" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.726775 4626 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" event={"ID":"d8a75041-f9ba-4691-9467-f20f9205daa6","Type":"ContainerStarted","Data":"74f02bb0a3c0122789a1baf9757c3bd7dd06dd500c1bf07a9d1cd00bd251eebc"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.728444 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.728560 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" event={"ID":"02bcecce-5716-45c6-8d40-f1da91d26673","Type":"ContainerStarted","Data":"06c62bbb9feae0c89681fb74e5e6c670213fff448aa7e0edaab5eb5b07d8eada"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.730192 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" event={"ID":"7f639fb4-3160-4bd7-ab2a-86ea80cb51ed","Type":"ContainerStarted","Data":"18e0e4f46195f7d18bae3715bcb24bb49936c4dd47c1ea8042ee569a3e231d25"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.730528 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.731818 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" event={"ID":"d9ef444e-448b-4576-960b-5861b7c19720","Type":"ContainerStarted","Data":"3d509aa5306f19145889e6717df3444a889308d4c5ee969c93201f13b0bc4c47"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.731974 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" Feb 23 06:56:46 crc 
kubenswrapper[4626]: I0223 06:56:46.733734 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" event={"ID":"98f392cd-76ee-4062-8ad1-15608b3658dc","Type":"ContainerStarted","Data":"03346784df8bb9271129820fb9b9a718298c0980697d02e841e6e5a78ac37731"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.733991 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.735057 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" event={"ID":"8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a","Type":"ContainerStarted","Data":"bbf49ab31631f0f81730af2081a8ed96ec51532060cd5c4b9c0af23de3c90af6"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.735178 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.736399 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" event={"ID":"47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5","Type":"ContainerStarted","Data":"1f90fe7509f7f8422704acf2c4588b3792d9c0d5d81d716bc19b9cad235d697f"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.736551 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.737863 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" event={"ID":"cfc5b69c-d190-4f73-a311-c9a371762530","Type":"ContainerStarted","Data":"d2df320c4b285f629496d958a39c6518a5309401b920f41e6947c864fce37081"} Feb 23 
06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.738084 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.739298 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" event={"ID":"6f3e393a-a659-44ae-bad8-6e4ff2d649ce","Type":"ContainerStarted","Data":"01b20f2a14aea9267f7748a10a9b3a597efbfb761cc1d215bfa25b8bcc6c3424"} Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.739449 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.753273 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" podStartSLOduration=19.480052671 podStartE2EDuration="28.753261631s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:36.581556685 +0000 UTC m=+948.920885951" lastFinishedPulling="2026-02-23 06:56:45.854765644 +0000 UTC m=+958.194094911" observedRunningTime="2026-02-23 06:56:46.753202989 +0000 UTC m=+959.092532256" watchObservedRunningTime="2026-02-23 06:56:46.753261631 +0000 UTC m=+959.092590896" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.775738 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" podStartSLOduration=3.263849365 podStartE2EDuration="28.775724214s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.32432672 +0000 UTC m=+932.663655987" lastFinishedPulling="2026-02-23 06:56:45.83620157 +0000 UTC m=+958.175530836" observedRunningTime="2026-02-23 06:56:46.771416184 +0000 UTC 
m=+959.110745450" watchObservedRunningTime="2026-02-23 06:56:46.775724214 +0000 UTC m=+959.115053480" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.784891 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vzwhb" podStartSLOduration=2.224501221 podStartE2EDuration="27.7848782s" podCreationTimestamp="2026-02-23 06:56:19 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.292944633 +0000 UTC m=+932.632273899" lastFinishedPulling="2026-02-23 06:56:45.853321612 +0000 UTC m=+958.192650878" observedRunningTime="2026-02-23 06:56:46.782319807 +0000 UTC m=+959.121649073" watchObservedRunningTime="2026-02-23 06:56:46.7848782 +0000 UTC m=+959.124207467" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.818136 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" podStartSLOduration=19.400698993 podStartE2EDuration="28.818124582s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:36.438338208 +0000 UTC m=+948.777667474" lastFinishedPulling="2026-02-23 06:56:45.855763797 +0000 UTC m=+958.195093063" observedRunningTime="2026-02-23 06:56:46.802804622 +0000 UTC m=+959.142133888" watchObservedRunningTime="2026-02-23 06:56:46.818124582 +0000 UTC m=+959.157453838" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.818797 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" podStartSLOduration=3.291239616 podStartE2EDuration="28.818792891s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.330143736 +0000 UTC m=+932.669473001" lastFinishedPulling="2026-02-23 06:56:45.85769701 +0000 UTC m=+958.197026276" observedRunningTime="2026-02-23 06:56:46.81682377 +0000 UTC m=+959.156153026" 
watchObservedRunningTime="2026-02-23 06:56:46.818792891 +0000 UTC m=+959.158122157" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.836084 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" podStartSLOduration=2.955107595 podStartE2EDuration="28.836064259s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:19.978202803 +0000 UTC m=+932.317532069" lastFinishedPulling="2026-02-23 06:56:45.859159467 +0000 UTC m=+958.198488733" observedRunningTime="2026-02-23 06:56:46.835562483 +0000 UTC m=+959.174891749" watchObservedRunningTime="2026-02-23 06:56:46.836064259 +0000 UTC m=+959.175393526" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.859134 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" podStartSLOduration=3.269640852 podStartE2EDuration="28.859115012s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.245986814 +0000 UTC m=+932.585316080" lastFinishedPulling="2026-02-23 06:56:45.835460974 +0000 UTC m=+958.174790240" observedRunningTime="2026-02-23 06:56:46.854167906 +0000 UTC m=+959.193497172" watchObservedRunningTime="2026-02-23 06:56:46.859115012 +0000 UTC m=+959.198444278" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.876633 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" podStartSLOduration=3.36714781 podStartE2EDuration="28.876614239s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.325227799 +0000 UTC m=+932.664557065" lastFinishedPulling="2026-02-23 06:56:45.834694229 +0000 UTC m=+958.174023494" observedRunningTime="2026-02-23 06:56:46.870150884 +0000 UTC m=+959.209480151" 
watchObservedRunningTime="2026-02-23 06:56:46.876614239 +0000 UTC m=+959.215943504" Feb 23 06:56:46 crc kubenswrapper[4626]: I0223 06:56:46.888070 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" podStartSLOduration=3.359174389 podStartE2EDuration="28.888051329s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="2026-02-23 06:56:20.329872164 +0000 UTC m=+932.669201430" lastFinishedPulling="2026-02-23 06:56:45.858749104 +0000 UTC m=+958.198078370" observedRunningTime="2026-02-23 06:56:46.881521028 +0000 UTC m=+959.220850294" watchObservedRunningTime="2026-02-23 06:56:46.888051329 +0000 UTC m=+959.227380595" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.305142 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-r6gc2" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.361563 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-6h2w7" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.431748 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-jlcjb" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.452552 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzl5c" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.536630 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-qqpl6" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.549338 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zsrhb" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.621288 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-867lr" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.735726 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-g6gzq" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.918224 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-8nqw2" Feb 23 06:56:48 crc kubenswrapper[4626]: I0223 06:56:48.959235 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-j9cqz" Feb 23 06:56:49 crc kubenswrapper[4626]: I0223 06:56:49.074572 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-db9df" Feb 23 06:56:49 crc kubenswrapper[4626]: I0223 06:56:49.192442 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-qfd84" Feb 23 06:56:49 crc kubenswrapper[4626]: I0223 06:56:49.221534 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ljvjt" Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.068216 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " 
pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.075335 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fab598e9-890d-4f24-b26b-8f5b507a86c8-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-mf52c\" (UID: \"fab598e9-890d-4f24-b26b-8f5b507a86c8\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.226298 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qm287" Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.234854 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.644921 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c"] Feb 23 06:56:51 crc kubenswrapper[4626]: W0223 06:56:51.647085 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab598e9_890d_4f24_b26b_8f5b507a86c8.slice/crio-927f54a80ebfc673a8dd3f59f98bb64fdde020b9c988e2a955dd775fe6a5e535 WatchSource:0}: Error finding container 927f54a80ebfc673a8dd3f59f98bb64fdde020b9c988e2a955dd775fe6a5e535: Status 404 returned error can't find the container with id 927f54a80ebfc673a8dd3f59f98bb64fdde020b9c988e2a955dd775fe6a5e535 Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.777944 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" 
event={"ID":"fab598e9-890d-4f24-b26b-8f5b507a86c8","Type":"ContainerStarted","Data":"ddfdbc77cc724776a5f3b6609fed6a114f91b105c021728ae29ae8504017e2bf"} Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.778461 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.778477 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" event={"ID":"fab598e9-890d-4f24-b26b-8f5b507a86c8","Type":"ContainerStarted","Data":"927f54a80ebfc673a8dd3f59f98bb64fdde020b9c988e2a955dd775fe6a5e535"} Feb 23 06:56:51 crc kubenswrapper[4626]: I0223 06:56:51.805454 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" podStartSLOduration=33.805428649 podStartE2EDuration="33.805428649s" podCreationTimestamp="2026-02-23 06:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:56:51.798532239 +0000 UTC m=+964.137861515" watchObservedRunningTime="2026-02-23 06:56:51.805428649 +0000 UTC m=+964.144757916" Feb 23 06:56:54 crc kubenswrapper[4626]: I0223 06:56:54.100610 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-6qn7p" Feb 23 06:56:54 crc kubenswrapper[4626]: I0223 06:56:54.612998 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75" Feb 23 06:56:55 crc kubenswrapper[4626]: I0223 06:56:55.686480 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:56:55 crc kubenswrapper[4626]: I0223 06:56:55.687379 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:56:58 crc kubenswrapper[4626]: I0223 06:56:58.799202 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-mnbrh" Feb 23 06:56:58 crc kubenswrapper[4626]: I0223 06:56:58.966968 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-vxbs9" Feb 23 06:56:59 crc kubenswrapper[4626]: I0223 06:56:59.062237 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-j524z" Feb 23 06:56:59 crc kubenswrapper[4626]: I0223 06:56:59.141847 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-hv75x" Feb 23 06:56:59 crc kubenswrapper[4626]: I0223 06:56:59.168186 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-tzh2l" Feb 23 06:56:59 crc kubenswrapper[4626]: I0223 06:56:59.182341 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-vdx4s" Feb 23 06:57:01 crc kubenswrapper[4626]: I0223 06:57:01.242180 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-mf52c" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.201780 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d4f84b66f-7fvpm"] Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.204814 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.209235 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.209642 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.209700 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-w824n" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.214139 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.219003 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d4f84b66f-7fvpm"] Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.281234 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lffb\" (UniqueName: \"kubernetes.io/projected/40686c3c-d7b0-4e55-9e2f-87a301b01878-kube-api-access-9lffb\") pod \"dnsmasq-dns-d4f84b66f-7fvpm\" (UID: \"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.281315 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40686c3c-d7b0-4e55-9e2f-87a301b01878-config\") pod \"dnsmasq-dns-d4f84b66f-7fvpm\" (UID: 
\"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.289289 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7969ff7869-6vz6l"] Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.290581 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.292471 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.306986 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7969ff7869-6vz6l"] Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.383195 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40686c3c-d7b0-4e55-9e2f-87a301b01878-config\") pod \"dnsmasq-dns-d4f84b66f-7fvpm\" (UID: \"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.383286 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-dns-svc\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.383328 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-config\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.383351 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz9gn\" (UniqueName: \"kubernetes.io/projected/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-kube-api-access-kz9gn\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.383398 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lffb\" (UniqueName: \"kubernetes.io/projected/40686c3c-d7b0-4e55-9e2f-87a301b01878-kube-api-access-9lffb\") pod \"dnsmasq-dns-d4f84b66f-7fvpm\" (UID: \"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.384602 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40686c3c-d7b0-4e55-9e2f-87a301b01878-config\") pod \"dnsmasq-dns-d4f84b66f-7fvpm\" (UID: \"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.405341 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lffb\" (UniqueName: \"kubernetes.io/projected/40686c3c-d7b0-4e55-9e2f-87a301b01878-kube-api-access-9lffb\") pod \"dnsmasq-dns-d4f84b66f-7fvpm\" (UID: \"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.484812 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-dns-svc\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.484931 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-config\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.484968 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz9gn\" (UniqueName: \"kubernetes.io/projected/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-kube-api-access-kz9gn\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.485717 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-dns-svc\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.486480 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-config\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.498167 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz9gn\" (UniqueName: \"kubernetes.io/projected/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-kube-api-access-kz9gn\") pod \"dnsmasq-dns-7969ff7869-6vz6l\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.521293 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm"
Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.608028 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7969ff7869-6vz6l"
Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.774564 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d4f84b66f-7fvpm"]
Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.899492 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7969ff7869-6vz6l"]
Feb 23 06:57:15 crc kubenswrapper[4626]: W0223 06:57:15.908851 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6f1af0_99c0_47ba_95f5_21e5204b7be1.slice/crio-62c712f5d82c0a56ad478c9e6f1762048a0246bfa07c071a488978c3b83acf31 WatchSource:0}: Error finding container 62c712f5d82c0a56ad478c9e6f1762048a0246bfa07c071a488978c3b83acf31: Status 404 returned error can't find the container with id 62c712f5d82c0a56ad478c9e6f1762048a0246bfa07c071a488978c3b83acf31
Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.951886 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" event={"ID":"40686c3c-d7b0-4e55-9e2f-87a301b01878","Type":"ContainerStarted","Data":"fb6bafa94eb8816e5c452c283ef8a31e984853ba4002ba01ca20d624b0d7a942"}
Feb 23 06:57:15 crc kubenswrapper[4626]: I0223 06:57:15.952599 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" event={"ID":"bb6f1af0-99c0-47ba-95f5-21e5204b7be1","Type":"ContainerStarted","Data":"62c712f5d82c0a56ad478c9e6f1762048a0246bfa07c071a488978c3b83acf31"}
Feb 23 06:57:17 crc kubenswrapper[4626]: I0223 06:57:17.983345 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7969ff7869-6vz6l"]
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.010242 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d79569fc7-qm48w"]
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.012075 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.088209 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79569fc7-qm48w"]
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.129065 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-dns-svc\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.129118 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-config\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.129156 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2djr\" (UniqueName: \"kubernetes.io/projected/2788c470-7610-45b4-a588-15d30d34acc7-kube-api-access-s2djr\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.231129 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-dns-svc\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.231173 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-config\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.231206 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2djr\" (UniqueName: \"kubernetes.io/projected/2788c470-7610-45b4-a588-15d30d34acc7-kube-api-access-s2djr\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.232266 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-dns-svc\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.232751 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-config\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.274068 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2djr\" (UniqueName: \"kubernetes.io/projected/2788c470-7610-45b4-a588-15d30d34acc7-kube-api-access-s2djr\") pod \"dnsmasq-dns-5d79569fc7-qm48w\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.355463 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d4f84b66f-7fvpm"]
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.369167 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.371724 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66df45869f-zc57z"]
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.373247 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.379337 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66df45869f-zc57z"]
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.448790 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-dns-svc\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.449103 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-config\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.449162 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkmzt\" (UniqueName: \"kubernetes.io/projected/849db342-900a-45c9-9edd-1f2b180a324e-kube-api-access-zkmzt\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.551192 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-config\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.551336 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkmzt\" (UniqueName: \"kubernetes.io/projected/849db342-900a-45c9-9edd-1f2b180a324e-kube-api-access-zkmzt\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.551487 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-dns-svc\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.552435 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-dns-svc\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.554044 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-config\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.578249 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkmzt\" (UniqueName: \"kubernetes.io/projected/849db342-900a-45c9-9edd-1f2b180a324e-kube-api-access-zkmzt\") pod \"dnsmasq-dns-66df45869f-zc57z\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.718860 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66df45869f-zc57z"
Feb 23 06:57:18 crc kubenswrapper[4626]: I0223 06:57:18.981788 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d79569fc7-qm48w"]
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.024644 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w" event={"ID":"2788c470-7610-45b4-a588-15d30d34acc7","Type":"ContainerStarted","Data":"e43888ab753eaa3da91192588a5365040275b17442b4b5f223a70fb93bdae0b5"}
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.126052 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66df45869f-zc57z"]
Feb 23 06:57:19 crc kubenswrapper[4626]: W0223 06:57:19.129814 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod849db342_900a_45c9_9edd_1f2b180a324e.slice/crio-f192f4494b3e7c45fe7ceb5690251f49ee54674d4943538edaa326adbbea6a00 WatchSource:0}: Error finding container f192f4494b3e7c45fe7ceb5690251f49ee54674d4943538edaa326adbbea6a00: Status 404 returned error can't find the container with id f192f4494b3e7c45fe7ceb5690251f49ee54674d4943538edaa326adbbea6a00
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.182718 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.184250 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.189136 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.189355 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.189555 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.189143 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.192079 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.192753 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.197770 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.197911 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6mg66"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260106 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260190 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb16cec1-24fc-4504-8968-0c3fb8368f27-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260236 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260276 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260292 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nrck\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-kube-api-access-9nrck\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260311 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260330 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260345 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260360 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260381 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb16cec1-24fc-4504-8968-0c3fb8368f27-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.260408 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363299 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363617 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb16cec1-24fc-4504-8968-0c3fb8368f27-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363650 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363675 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363688 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nrck\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-kube-api-access-9nrck\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363705 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363722 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363741 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363755 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363775 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb16cec1-24fc-4504-8968-0c3fb8368f27-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363804 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.363775 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.364063 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.374682 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.383922 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.386368 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.386424 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.386570 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.388821 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nrck\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-kube-api-access-9nrck\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.389555 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb16cec1-24fc-4504-8968-0c3fb8368f27-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.392929 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb16cec1-24fc-4504-8968-0c3fb8368f27-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.393304 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.395864 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.477534 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.479926 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.485858 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.486084 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.486124 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.486427 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lhrrt"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.486439 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.486723 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.498524 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.498909 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.534953 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589118 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589194 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589234 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7fk\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-kube-api-access-lp7fk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589268 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589282 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589299 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589314 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589339 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589357 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589375 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.589395 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.693791 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.693834 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.693865 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.693887 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.693948 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.693979 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.694018 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.694061 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.694134 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.694165 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.694229 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7fk\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-kube-api-access-lp7fk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.695158 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.695332 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.695551 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.695735 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.695991 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.697148 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.697176 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.700241 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.700461 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:57:19 crc
kubenswrapper[4626]: I0223 06:57:19.700813 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.714336 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7fk\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-kube-api-access-lp7fk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.724579 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 06:57:19 crc kubenswrapper[4626]: I0223 06:57:19.804982 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.046382 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66df45869f-zc57z" event={"ID":"849db342-900a-45c9-9edd-1f2b180a324e","Type":"ContainerStarted","Data":"f192f4494b3e7c45fe7ceb5690251f49ee54674d4943538edaa326adbbea6a00"} Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.095781 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 06:57:20 crc kubenswrapper[4626]: W0223 06:57:20.123292 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb16cec1_24fc_4504_8968_0c3fb8368f27.slice/crio-07e6b3b1e7dbb1272dea499e9b8a60d0f849d9ebcd0c9c88076e33a9ec0d35cc WatchSource:0}: Error finding container 07e6b3b1e7dbb1272dea499e9b8a60d0f849d9ebcd0c9c88076e33a9ec0d35cc: Status 404 returned error can't find the container with id 07e6b3b1e7dbb1272dea499e9b8a60d0f849d9ebcd0c9c88076e33a9ec0d35cc Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.313771 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.572384 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.599857 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.600072 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.605140 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.605371 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.605577 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-k5j8g" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.609365 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.623445 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea075-8063-4feb-8e91-3160073129ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.627196 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn4bs\" (UniqueName: \"kubernetes.io/projected/3b2ea075-8063-4feb-8e91-3160073129ff-kube-api-access-sn4bs\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.627362 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2ea075-8063-4feb-8e91-3160073129ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 
06:57:20.627461 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.631438 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b2ea075-8063-4feb-8e91-3160073129ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.631575 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.631676 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.631774 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.629093 4626 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"combined-ca-bundle" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.733997 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b2ea075-8063-4feb-8e91-3160073129ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734073 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734123 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734187 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734245 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea075-8063-4feb-8e91-3160073129ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734290 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sn4bs\" (UniqueName: \"kubernetes.io/projected/3b2ea075-8063-4feb-8e91-3160073129ff-kube-api-access-sn4bs\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734344 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2ea075-8063-4feb-8e91-3160073129ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734662 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.734882 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3b2ea075-8063-4feb-8e91-3160073129ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.735659 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.735725 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.736155 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.739374 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b2ea075-8063-4feb-8e91-3160073129ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.741675 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b2ea075-8063-4feb-8e91-3160073129ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.753307 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b2ea075-8063-4feb-8e91-3160073129ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.754604 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn4bs\" (UniqueName: \"kubernetes.io/projected/3b2ea075-8063-4feb-8e91-3160073129ff-kube-api-access-sn4bs\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " 
pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.759672 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"3b2ea075-8063-4feb-8e91-3160073129ff\") " pod="openstack/openstack-galera-0" Feb 23 06:57:20 crc kubenswrapper[4626]: I0223 06:57:20.931899 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.109901 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1","Type":"ContainerStarted","Data":"101e15c404b4c18fb56512a2757067670b794ab90b07cb9e0d900b121b695d02"} Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.117976 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb16cec1-24fc-4504-8968-0c3fb8368f27","Type":"ContainerStarted","Data":"07e6b3b1e7dbb1272dea499e9b8a60d0f849d9ebcd0c9c88076e33a9ec0d35cc"} Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.469115 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.938011 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.947272 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.947372 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.953287 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.953641 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gbrpn" Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.953854 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 23 06:57:21 crc kubenswrapper[4626]: I0223 06:57:21.954017 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064560 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064657 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a2b658-d642-449c-be58-94b80484618e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064690 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a2b658-d642-449c-be58-94b80484618e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 
23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064736 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064810 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a2b658-d642-449c-be58-94b80484618e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064830 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064858 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7gnw\" (UniqueName: \"kubernetes.io/projected/03a2b658-d642-449c-be58-94b80484618e-kube-api-access-h7gnw\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.064879 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 
06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171175 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171297 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a2b658-d642-449c-be58-94b80484618e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171324 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a2b658-d642-449c-be58-94b80484618e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171380 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171462 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a2b658-d642-449c-be58-94b80484618e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171481 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171528 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7gnw\" (UniqueName: \"kubernetes.io/projected/03a2b658-d642-449c-be58-94b80484618e-kube-api-access-h7gnw\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.171558 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.172459 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.172567 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3b2ea075-8063-4feb-8e91-3160073129ff","Type":"ContainerStarted","Data":"39d676e12549fb24ae9cf9d866a84049526c6f1c84f94afd0fe40276e12af73b"} Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.173327 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.173633 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a2b658-d642-449c-be58-94b80484618e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.180701 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.182548 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a2b658-d642-449c-be58-94b80484618e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.192719 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a2b658-d642-449c-be58-94b80484618e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.208389 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a2b658-d642-449c-be58-94b80484618e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " 
pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.213481 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7gnw\" (UniqueName: \"kubernetes.io/projected/03a2b658-d642-449c-be58-94b80484618e-kube-api-access-h7gnw\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.216813 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.224962 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.234467 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.236149 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.236449 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.236658 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-g5qgh" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.264352 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"03a2b658-d642-449c-be58-94b80484618e\") " pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.274309 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/38df29c3-b467-498f-9ec1-a83cc91c27ca-memcached-tls-certs\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.274359 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df29c3-b467-498f-9ec1-a83cc91c27ca-combined-ca-bundle\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.274407 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38df29c3-b467-498f-9ec1-a83cc91c27ca-kolla-config\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.274465 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38df29c3-b467-498f-9ec1-a83cc91c27ca-config-data\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.274564 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gs4x\" (UniqueName: \"kubernetes.io/projected/38df29c3-b467-498f-9ec1-a83cc91c27ca-kube-api-access-9gs4x\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.292383 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.386445 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/38df29c3-b467-498f-9ec1-a83cc91c27ca-memcached-tls-certs\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.386516 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df29c3-b467-498f-9ec1-a83cc91c27ca-combined-ca-bundle\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.386556 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38df29c3-b467-498f-9ec1-a83cc91c27ca-kolla-config\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.386601 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38df29c3-b467-498f-9ec1-a83cc91c27ca-config-data\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.386655 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gs4x\" (UniqueName: \"kubernetes.io/projected/38df29c3-b467-498f-9ec1-a83cc91c27ca-kube-api-access-9gs4x\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.388708 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38df29c3-b467-498f-9ec1-a83cc91c27ca-kolla-config\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.399660 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38df29c3-b467-498f-9ec1-a83cc91c27ca-config-data\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.403187 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gs4x\" (UniqueName: \"kubernetes.io/projected/38df29c3-b467-498f-9ec1-a83cc91c27ca-kube-api-access-9gs4x\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.406151 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/38df29c3-b467-498f-9ec1-a83cc91c27ca-memcached-tls-certs\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.407182 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df29c3-b467-498f-9ec1-a83cc91c27ca-combined-ca-bundle\") pod \"memcached-0\" (UID: \"38df29c3-b467-498f-9ec1-a83cc91c27ca\") " pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.596082 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.881674 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 06:57:22 crc kubenswrapper[4626]: I0223 06:57:22.940097 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 06:57:23 crc kubenswrapper[4626]: I0223 06:57:23.194348 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"03a2b658-d642-449c-be58-94b80484618e","Type":"ContainerStarted","Data":"03b0c97198ae1ebb705b64b5bce37e453877435fd720a45c520287918c3e7849"} Feb 23 06:57:23 crc kubenswrapper[4626]: I0223 06:57:23.200941 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"38df29c3-b467-498f-9ec1-a83cc91c27ca","Type":"ContainerStarted","Data":"191776e722458798d66bcb1ed14f84fc3f83e80e5fd5f34c09bfc86a4830d7d5"} Feb 23 06:57:24 crc kubenswrapper[4626]: I0223 06:57:24.843897 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 06:57:24 crc kubenswrapper[4626]: I0223 06:57:24.845884 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 06:57:24 crc kubenswrapper[4626]: I0223 06:57:24.862475 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-89vv9" Feb 23 06:57:24 crc kubenswrapper[4626]: I0223 06:57:24.873530 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 06:57:24 crc kubenswrapper[4626]: I0223 06:57:24.964336 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrlw7\" (UniqueName: \"kubernetes.io/projected/c8ff8af3-00d8-482b-9054-fa9b2f3bc766-kube-api-access-lrlw7\") pod \"kube-state-metrics-0\" (UID: \"c8ff8af3-00d8-482b-9054-fa9b2f3bc766\") " pod="openstack/kube-state-metrics-0" Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.066458 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrlw7\" (UniqueName: \"kubernetes.io/projected/c8ff8af3-00d8-482b-9054-fa9b2f3bc766-kube-api-access-lrlw7\") pod \"kube-state-metrics-0\" (UID: \"c8ff8af3-00d8-482b-9054-fa9b2f3bc766\") " pod="openstack/kube-state-metrics-0" Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.096356 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrlw7\" (UniqueName: \"kubernetes.io/projected/c8ff8af3-00d8-482b-9054-fa9b2f3bc766-kube-api-access-lrlw7\") pod \"kube-state-metrics-0\" (UID: \"c8ff8af3-00d8-482b-9054-fa9b2f3bc766\") " pod="openstack/kube-state-metrics-0" Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.196747 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.686652 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.686957 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.687002 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.687450 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3524b229b91a471ae481a5ae0ae84698672cefa6dfcb3ba75eb9b84f9fff35b3"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.687527 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://3524b229b91a471ae481a5ae0ae84698672cefa6dfcb3ba75eb9b84f9fff35b3" gracePeriod=600 Feb 23 06:57:25 crc kubenswrapper[4626]: I0223 06:57:25.775284 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] 
Feb 23 06:57:26 crc kubenswrapper[4626]: I0223 06:57:26.294774 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8ff8af3-00d8-482b-9054-fa9b2f3bc766","Type":"ContainerStarted","Data":"b4566f124eb2b1c79abc3b65f023ab3570795a37fbd2b8b29ed0e5d80542189d"} Feb 23 06:57:26 crc kubenswrapper[4626]: I0223 06:57:26.303156 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="3524b229b91a471ae481a5ae0ae84698672cefa6dfcb3ba75eb9b84f9fff35b3" exitCode=0 Feb 23 06:57:26 crc kubenswrapper[4626]: I0223 06:57:26.303191 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"3524b229b91a471ae481a5ae0ae84698672cefa6dfcb3ba75eb9b84f9fff35b3"} Feb 23 06:57:26 crc kubenswrapper[4626]: I0223 06:57:26.303238 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"a5470378b5d8c9e6d24ed5c140362129ba764fb086113e814588933f56ca4c24"} Feb 23 06:57:26 crc kubenswrapper[4626]: I0223 06:57:26.303258 4626 scope.go:117] "RemoveContainer" containerID="e82542f36ca6c092b0099e501573dbf83452c1447d038b617beda611bf3799cf" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.756880 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9j9vm"] Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.762560 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.766639 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zzzw8" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.766827 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.766921 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.795781 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9j9vm"] Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.853967 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-log-ovn\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.854034 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db3f96-4a68-44bd-82ff-076ba32d9066-combined-ca-bundle\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.854072 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-run-ovn\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.854192 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-run\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.854240 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61db3f96-4a68-44bd-82ff-076ba32d9066-scripts\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.854260 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db3f96-4a68-44bd-82ff-076ba32d9066-ovn-controller-tls-certs\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.854349 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66k74\" (UniqueName: \"kubernetes.io/projected/61db3f96-4a68-44bd-82ff-076ba32d9066-kube-api-access-66k74\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.893428 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-574ch"] Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.896248 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.919938 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-574ch"] Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.956638 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-log-ovn\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.956830 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db3f96-4a68-44bd-82ff-076ba32d9066-combined-ca-bundle\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.956983 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-run-ovn\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.961067 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd241f0-61b5-4185-a928-41cf22745048-scripts\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.961191 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-etc-ovs\") pod 
\"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.961297 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-run\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.961441 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-run\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.961602 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61db3f96-4a68-44bd-82ff-076ba32d9066-scripts\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.963534 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db3f96-4a68-44bd-82ff-076ba32d9066-ovn-controller-tls-certs\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.963731 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-log\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" 
Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.964824 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61db3f96-4a68-44bd-82ff-076ba32d9066-scripts\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.964942 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-lib\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.965037 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66k74\" (UniqueName: \"kubernetes.io/projected/61db3f96-4a68-44bd-82ff-076ba32d9066-kube-api-access-66k74\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.965126 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbq6\" (UniqueName: \"kubernetes.io/projected/afd241f0-61b5-4185-a928-41cf22745048-kube-api-access-kkbq6\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.957210 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-run-ovn\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.957097 4626 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-log-ovn\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.965820 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/61db3f96-4a68-44bd-82ff-076ba32d9066-var-run\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.983068 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/61db3f96-4a68-44bd-82ff-076ba32d9066-ovn-controller-tls-certs\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.983265 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db3f96-4a68-44bd-82ff-076ba32d9066-combined-ca-bundle\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:27 crc kubenswrapper[4626]: I0223 06:57:27.990751 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66k74\" (UniqueName: \"kubernetes.io/projected/61db3f96-4a68-44bd-82ff-076ba32d9066-kube-api-access-66k74\") pod \"ovn-controller-9j9vm\" (UID: \"61db3f96-4a68-44bd-82ff-076ba32d9066\") " pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.068572 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd241f0-61b5-4185-a928-41cf22745048-scripts\") pod 
\"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.068646 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-etc-ovs\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.068683 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-run\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.068828 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-log\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.068887 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-lib\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.068918 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbq6\" (UniqueName: \"kubernetes.io/projected/afd241f0-61b5-4185-a928-41cf22745048-kube-api-access-kkbq6\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 
crc kubenswrapper[4626]: I0223 06:57:28.068991 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-run\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.071833 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-log\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.071975 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-etc-ovs\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.072355 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/afd241f0-61b5-4185-a928-41cf22745048-var-lib\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.074058 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd241f0-61b5-4185-a928-41cf22745048-scripts\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.079990 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.106158 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbq6\" (UniqueName: \"kubernetes.io/projected/afd241f0-61b5-4185-a928-41cf22745048-kube-api-access-kkbq6\") pod \"ovn-controller-ovs-574ch\" (UID: \"afd241f0-61b5-4185-a928-41cf22745048\") " pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.229949 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.443980 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.445085 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.448848 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dlkpf" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.452463 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.453140 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.453385 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.457452 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.489253 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 06:57:28 crc 
kubenswrapper[4626]: I0223 06:57:28.580419 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b918ead-8c70-463f-b938-948436aa4278-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.580461 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b918ead-8c70-463f-b938-948436aa4278-config\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.580558 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.580626 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.580664 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xk2p\" (UniqueName: \"kubernetes.io/projected/6b918ead-8c70-463f-b938-948436aa4278-kube-api-access-9xk2p\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.580695 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.580721 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b918ead-8c70-463f-b938-948436aa4278-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.580757 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.683386 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.683446 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xk2p\" (UniqueName: \"kubernetes.io/projected/6b918ead-8c70-463f-b938-948436aa4278-kube-api-access-9xk2p\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.683480 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.684134 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b918ead-8c70-463f-b938-948436aa4278-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.684180 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.684205 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b918ead-8c70-463f-b938-948436aa4278-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.684215 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.685211 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b918ead-8c70-463f-b938-948436aa4278-config\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc 
kubenswrapper[4626]: I0223 06:57:28.684223 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b918ead-8c70-463f-b938-948436aa4278-config\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.685699 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.686189 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b918ead-8c70-463f-b938-948436aa4278-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.686355 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b918ead-8c70-463f-b938-948436aa4278-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.689455 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.697777 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.698598 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xk2p\" (UniqueName: \"kubernetes.io/projected/6b918ead-8c70-463f-b938-948436aa4278-kube-api-access-9xk2p\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.710230 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b918ead-8c70-463f-b938-948436aa4278-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.716276 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6b918ead-8c70-463f-b938-948436aa4278\") " pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:28 crc kubenswrapper[4626]: I0223 06:57:28.775219 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 06:57:30 crc kubenswrapper[4626]: I0223 06:57:30.983798 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 06:57:30 crc kubenswrapper[4626]: I0223 06:57:30.988604 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:30 crc kubenswrapper[4626]: I0223 06:57:30.994260 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 23 06:57:30 crc kubenswrapper[4626]: I0223 06:57:30.994539 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 23 06:57:30 crc kubenswrapper[4626]: I0223 06:57:30.994881 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-bbck8" Feb 23 06:57:30 crc kubenswrapper[4626]: I0223 06:57:30.995086 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.015984 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059389 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059448 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f76a96be-520f-46e8-9e47-4a4d3237359e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059506 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f76a96be-520f-46e8-9e47-4a4d3237359e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059562 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059798 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059854 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjm8x\" (UniqueName: \"kubernetes.io/projected/f76a96be-520f-46e8-9e47-4a4d3237359e-kube-api-access-mjm8x\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059874 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.059915 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f76a96be-520f-46e8-9e47-4a4d3237359e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 
23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.162070 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.162406 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjm8x\" (UniqueName: \"kubernetes.io/projected/f76a96be-520f-46e8-9e47-4a4d3237359e-kube-api-access-mjm8x\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.162448 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.162447 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.162738 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f76a96be-520f-46e8-9e47-4a4d3237359e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.164031 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f76a96be-520f-46e8-9e47-4a4d3237359e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.164402 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.165054 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f76a96be-520f-46e8-9e47-4a4d3237359e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.165865 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f76a96be-520f-46e8-9e47-4a4d3237359e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.169090 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f76a96be-520f-46e8-9e47-4a4d3237359e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.169195 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 
crc kubenswrapper[4626]: I0223 06:57:31.170063 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f76a96be-520f-46e8-9e47-4a4d3237359e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.172345 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.173513 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.182553 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76a96be-520f-46e8-9e47-4a4d3237359e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.189833 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.190520 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjm8x\" (UniqueName: 
\"kubernetes.io/projected/f76a96be-520f-46e8-9e47-4a4d3237359e-kube-api-access-mjm8x\") pod \"ovsdbserver-sb-0\" (UID: \"f76a96be-520f-46e8-9e47-4a4d3237359e\") " pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:31 crc kubenswrapper[4626]: I0223 06:57:31.329044 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 06:57:35 crc kubenswrapper[4626]: I0223 06:57:35.317148 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9j9vm"] Feb 23 06:57:35 crc kubenswrapper[4626]: I0223 06:57:35.855066 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-574ch"] Feb 23 06:57:36 crc kubenswrapper[4626]: I0223 06:57:36.507563 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9j9vm" event={"ID":"61db3f96-4a68-44bd-82ff-076ba32d9066","Type":"ContainerStarted","Data":"3e4472277ab1b6d859ee3aa7e4da5e7854862f01de37c4da5a897b553c45750b"} Feb 23 06:57:40 crc kubenswrapper[4626]: I0223 06:57:40.556785 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-574ch" event={"ID":"afd241f0-61b5-4185-a928-41cf22745048","Type":"ContainerStarted","Data":"afe97a1cb9ad5b0cef18873de475c85e95a2df7e6d459df7049e48bd51974285"} Feb 23 06:57:47 crc kubenswrapper[4626]: E0223 06:57:47.041461 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:47 crc kubenswrapper[4626]: E0223 06:57:47.042222 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:47 crc kubenswrapper[4626]: E0223 06:57:47.042420 4626 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h7gnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Volu
meDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(03a2b658-d642-449c-be58-94b80484618e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:47 crc kubenswrapper[4626]: E0223 06:57:47.043655 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="03a2b658-d642-449c-be58-94b80484618e" Feb 23 06:57:47 crc kubenswrapper[4626]: E0223 06:57:47.613462 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="03a2b658-d642-449c-be58-94b80484618e" Feb 23 06:57:48 crc kubenswrapper[4626]: E0223 06:57:48.139333 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:48 crc kubenswrapper[4626]: E0223 06:57:48.139390 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:48 crc kubenswrapper[4626]: E0223 06:57:48.139593 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie 
/var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nrck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[
ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(cb16cec1-24fc-4504-8968-0c3fb8368f27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:48 crc kubenswrapper[4626]: E0223 06:57:48.141372 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" Feb 23 06:57:48 crc kubenswrapper[4626]: E0223 06:57:48.620946 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/rabbitmq-server-0" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.013470 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.013884 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 
06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.014132 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zkmzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfil
e:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-66df45869f-zc57z_openstack(849db342-900a-45c9-9edd-1f2b180a324e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.015373 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-66df45869f-zc57z" podUID="849db342-900a-45c9-9edd-1f2b180a324e" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.025098 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.025170 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.025329 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lffb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-d4f84b66f-7fvpm_openstack(40686c3c-d7b0-4e55-9e2f-87a301b01878): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.026535 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" podUID="40686c3c-d7b0-4e55-9e2f-87a301b01878" Feb 23 06:57:49 crc kubenswrapper[4626]: E0223 06:57:49.641555 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/dnsmasq-dns-66df45869f-zc57z" podUID="849db342-900a-45c9-9edd-1f2b180a324e" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.340376 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.340684 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.340817 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8419493e1fd846703d277695e03fc5eb,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n9fh685h657h6dh94h664h546h59h9ch57bh658h7ch587h699h69h659h64dh685h554hbch68fh646h5cch644h545h565h5d4h5ddh66dh65bh98h55dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gs4x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(38df29c3-b467-498f-9ec1-a83cc91c27ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.342002 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="38df29c3-b467-498f-9ec1-a83cc91c27ca" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.364294 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.364340 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.364472 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lp7fk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:50 crc 
kubenswrapper[4626]: E0223 06:57:50.365692 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.459450 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.459572 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.459860 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2djr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d79569fc7-qm48w_openstack(2788c470-7610-45b4-a588-15d30d34acc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.461010 4626 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w" podUID="2788c470-7610-45b4-a588-15d30d34acc7" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.478329 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.489984 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.490055 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.490204 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kz9gn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7969ff7869-6vz6l_openstack(bb6f1af0-99c0-47ba-95f5-21e5204b7be1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.496709 4626 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" podUID="bb6f1af0-99c0-47ba-95f5-21e5204b7be1" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.517034 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40686c3c-d7b0-4e55-9e2f-87a301b01878-config\") pod \"40686c3c-d7b0-4e55-9e2f-87a301b01878\" (UID: \"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.517192 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lffb\" (UniqueName: \"kubernetes.io/projected/40686c3c-d7b0-4e55-9e2f-87a301b01878-kube-api-access-9lffb\") pod \"40686c3c-d7b0-4e55-9e2f-87a301b01878\" (UID: \"40686c3c-d7b0-4e55-9e2f-87a301b01878\") " Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.520083 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40686c3c-d7b0-4e55-9e2f-87a301b01878-config" (OuterVolumeSpecName: "config") pod "40686c3c-d7b0-4e55-9e2f-87a301b01878" (UID: "40686c3c-d7b0-4e55-9e2f-87a301b01878"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.529740 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40686c3c-d7b0-4e55-9e2f-87a301b01878-kube-api-access-9lffb" (OuterVolumeSpecName: "kube-api-access-9lffb") pod "40686c3c-d7b0-4e55-9e2f-87a301b01878" (UID: "40686c3c-d7b0-4e55-9e2f-87a301b01878"). InnerVolumeSpecName "kube-api-access-9lffb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.620657 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lffb\" (UniqueName: \"kubernetes.io/projected/40686c3c-d7b0-4e55-9e2f-87a301b01878-kube-api-access-9lffb\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.620697 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40686c3c-d7b0-4e55-9e2f-87a301b01878-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.644596 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.644944 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d4f84b66f-7fvpm" event={"ID":"40686c3c-d7b0-4e55-9e2f-87a301b01878","Type":"ContainerDied","Data":"fb6bafa94eb8816e5c452c283ef8a31e984853ba4002ba01ca20d624b0d7a942"} Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.647175 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.647538 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/memcached-0" podUID="38df29c3-b467-498f-9ec1-a83cc91c27ca" Feb 23 06:57:50 crc kubenswrapper[4626]: E0223 06:57:50.647615 4626 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w" podUID="2788c470-7610-45b4-a588-15d30d34acc7" Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.778429 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d4f84b66f-7fvpm"] Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.780538 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d4f84b66f-7fvpm"] Feb 23 06:57:50 crc kubenswrapper[4626]: I0223 06:57:50.872476 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.051899 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.094746 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-k59cf"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.095854 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.103550 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.118964 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k59cf"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.236948 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/707342b7-0d1b-431f-98bb-99af693f57b2-ovn-rundir\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.237015 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/707342b7-0d1b-431f-98bb-99af693f57b2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.237118 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/707342b7-0d1b-431f-98bb-99af693f57b2-ovs-rundir\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.237155 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/707342b7-0d1b-431f-98bb-99af693f57b2-config\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " 
pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.237195 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427qz\" (UniqueName: \"kubernetes.io/projected/707342b7-0d1b-431f-98bb-99af693f57b2-kube-api-access-427qz\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.237277 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707342b7-0d1b-431f-98bb-99af693f57b2-combined-ca-bundle\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.342146 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/707342b7-0d1b-431f-98bb-99af693f57b2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.342293 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/707342b7-0d1b-431f-98bb-99af693f57b2-ovs-rundir\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.342333 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/707342b7-0d1b-431f-98bb-99af693f57b2-config\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " 
pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.342397 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427qz\" (UniqueName: \"kubernetes.io/projected/707342b7-0d1b-431f-98bb-99af693f57b2-kube-api-access-427qz\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.342444 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707342b7-0d1b-431f-98bb-99af693f57b2-combined-ca-bundle\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.342604 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/707342b7-0d1b-431f-98bb-99af693f57b2-ovn-rundir\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.342973 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/707342b7-0d1b-431f-98bb-99af693f57b2-ovn-rundir\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.344279 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/707342b7-0d1b-431f-98bb-99af693f57b2-ovs-rundir\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc 
kubenswrapper[4626]: I0223 06:57:51.344345 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/707342b7-0d1b-431f-98bb-99af693f57b2-config\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.351790 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/707342b7-0d1b-431f-98bb-99af693f57b2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.352192 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707342b7-0d1b-431f-98bb-99af693f57b2-combined-ca-bundle\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.362428 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427qz\" (UniqueName: \"kubernetes.io/projected/707342b7-0d1b-431f-98bb-99af693f57b2-kube-api-access-427qz\") pod \"ovn-controller-metrics-k59cf\" (UID: \"707342b7-0d1b-431f-98bb-99af693f57b2\") " pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.412770 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-k59cf" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.443898 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79569fc7-qm48w"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.480172 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-88448458f-7j4tm"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.481388 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.495996 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.501945 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-88448458f-7j4tm"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.627054 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66df45869f-zc57z"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.647925 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnms8\" (UniqueName: \"kubernetes.io/projected/7f0ca890-098e-4dea-b359-10582f94c059-kube-api-access-gnms8\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.647965 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-config\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.648043 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-ovsdbserver-nb\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.648210 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-dns-svc\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.648915 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8474bb99-4mn6h"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.650080 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.662962 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.682378 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8474bb99-4mn6h"] Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.750803 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.750902 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-config\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.750935 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njfwm\" (UniqueName: \"kubernetes.io/projected/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-kube-api-access-njfwm\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.750991 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-dns-svc\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.751046 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.751153 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-dns-svc\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.751201 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnms8\" (UniqueName: 
\"kubernetes.io/projected/7f0ca890-098e-4dea-b359-10582f94c059-kube-api-access-gnms8\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.751223 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-config\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.751282 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-ovsdbserver-nb\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.752256 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-ovsdbserver-nb\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.752970 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-dns-svc\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.753827 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-config\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: 
\"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.787939 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnms8\" (UniqueName: \"kubernetes.io/projected/7f0ca890-098e-4dea-b359-10582f94c059-kube-api-access-gnms8\") pod \"dnsmasq-dns-88448458f-7j4tm\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") " pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.796718 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.853241 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.853315 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-config\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.853344 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njfwm\" (UniqueName: \"kubernetes.io/projected/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-kube-api-access-njfwm\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.853386 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.853459 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-dns-svc\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.854243 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-dns-svc\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.854323 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-config\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.854932 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.855101 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" 
(UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.868644 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njfwm\" (UniqueName: \"kubernetes.io/projected/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-kube-api-access-njfwm\") pod \"dnsmasq-dns-7b8474bb99-4mn6h\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.968253 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:51 crc kubenswrapper[4626]: I0223 06:57:51.993221 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40686c3c-d7b0-4e55-9e2f-87a301b01878" path="/var/lib/kubelet/pods/40686c3c-d7b0-4e55-9e2f-87a301b01878/volumes" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.125470 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.209552 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.211181 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66df45869f-zc57z" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.236963 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384160 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2djr\" (UniqueName: \"kubernetes.io/projected/2788c470-7610-45b4-a588-15d30d34acc7-kube-api-access-s2djr\") pod \"2788c470-7610-45b4-a588-15d30d34acc7\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384276 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-dns-svc\") pod \"2788c470-7610-45b4-a588-15d30d34acc7\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384349 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz9gn\" (UniqueName: \"kubernetes.io/projected/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-kube-api-access-kz9gn\") pod \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384368 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-dns-svc\") pod \"849db342-900a-45c9-9edd-1f2b180a324e\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384431 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-config\") pod \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384480 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-config\") pod \"849db342-900a-45c9-9edd-1f2b180a324e\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384534 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkmzt\" (UniqueName: \"kubernetes.io/projected/849db342-900a-45c9-9edd-1f2b180a324e-kube-api-access-zkmzt\") pod \"849db342-900a-45c9-9edd-1f2b180a324e\" (UID: \"849db342-900a-45c9-9edd-1f2b180a324e\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384562 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-config\") pod \"2788c470-7610-45b4-a588-15d30d34acc7\" (UID: \"2788c470-7610-45b4-a588-15d30d34acc7\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.384632 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-dns-svc\") pod \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\" (UID: \"bb6f1af0-99c0-47ba-95f5-21e5204b7be1\") " Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.385150 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "849db342-900a-45c9-9edd-1f2b180a324e" (UID: "849db342-900a-45c9-9edd-1f2b180a324e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.385639 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb6f1af0-99c0-47ba-95f5-21e5204b7be1" (UID: "bb6f1af0-99c0-47ba-95f5-21e5204b7be1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.385669 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-config" (OuterVolumeSpecName: "config") pod "bb6f1af0-99c0-47ba-95f5-21e5204b7be1" (UID: "bb6f1af0-99c0-47ba-95f5-21e5204b7be1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.386069 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-config" (OuterVolumeSpecName: "config") pod "849db342-900a-45c9-9edd-1f2b180a324e" (UID: "849db342-900a-45c9-9edd-1f2b180a324e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.386076 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-config" (OuterVolumeSpecName: "config") pod "2788c470-7610-45b4-a588-15d30d34acc7" (UID: "2788c470-7610-45b4-a588-15d30d34acc7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.386116 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2788c470-7610-45b4-a588-15d30d34acc7" (UID: "2788c470-7610-45b4-a588-15d30d34acc7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.389197 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849db342-900a-45c9-9edd-1f2b180a324e-kube-api-access-zkmzt" (OuterVolumeSpecName: "kube-api-access-zkmzt") pod "849db342-900a-45c9-9edd-1f2b180a324e" (UID: "849db342-900a-45c9-9edd-1f2b180a324e"). InnerVolumeSpecName "kube-api-access-zkmzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.390015 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-kube-api-access-kz9gn" (OuterVolumeSpecName: "kube-api-access-kz9gn") pod "bb6f1af0-99c0-47ba-95f5-21e5204b7be1" (UID: "bb6f1af0-99c0-47ba-95f5-21e5204b7be1"). InnerVolumeSpecName "kube-api-access-kz9gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.401142 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2788c470-7610-45b4-a588-15d30d34acc7-kube-api-access-s2djr" (OuterVolumeSpecName: "kube-api-access-s2djr") pod "2788c470-7610-45b4-a588-15d30d34acc7" (UID: "2788c470-7610-45b4-a588-15d30d34acc7"). InnerVolumeSpecName "kube-api-access-s2djr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488121 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488166 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz9gn\" (UniqueName: \"kubernetes.io/projected/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-kube-api-access-kz9gn\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488185 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488198 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488211 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849db342-900a-45c9-9edd-1f2b180a324e-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488222 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkmzt\" (UniqueName: \"kubernetes.io/projected/849db342-900a-45c9-9edd-1f2b180a324e-kube-api-access-zkmzt\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488233 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2788c470-7610-45b4-a588-15d30d34acc7-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488244 4626 reconciler_common.go:293] "Volume detached for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6f1af0-99c0-47ba-95f5-21e5204b7be1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.488255 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2djr\" (UniqueName: \"kubernetes.io/projected/2788c470-7610-45b4-a588-15d30d34acc7-kube-api-access-s2djr\") on node \"crc\" DevicePath \"\"" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.690661 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f76a96be-520f-46e8-9e47-4a4d3237359e","Type":"ContainerStarted","Data":"9ae52044af25c423e65232b2ef635f89206547221e0181a1e048654a576c320f"} Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.696939 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w" event={"ID":"2788c470-7610-45b4-a588-15d30d34acc7","Type":"ContainerDied","Data":"e43888ab753eaa3da91192588a5365040275b17442b4b5f223a70fb93bdae0b5"} Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.697021 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d79569fc7-qm48w" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.705019 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66df45869f-zc57z" event={"ID":"849db342-900a-45c9-9edd-1f2b180a324e","Type":"ContainerDied","Data":"f192f4494b3e7c45fe7ceb5690251f49ee54674d4943538edaa326adbbea6a00"} Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.705035 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66df45869f-zc57z" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.706454 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b918ead-8c70-463f-b938-948436aa4278","Type":"ContainerStarted","Data":"9f3b6613a3c942a1c100539c06850e65ec8efd3a0db7e6d9bb45e22ec60b6e5c"} Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.714294 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" event={"ID":"bb6f1af0-99c0-47ba-95f5-21e5204b7be1","Type":"ContainerDied","Data":"62c712f5d82c0a56ad478c9e6f1762048a0246bfa07c071a488978c3b83acf31"} Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.714409 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7969ff7869-6vz6l" Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.747940 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d79569fc7-qm48w"] Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.763771 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d79569fc7-qm48w"] Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.775382 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66df45869f-zc57z"] Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.778060 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66df45869f-zc57z"] Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.812248 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7969ff7869-6vz6l"] Feb 23 06:57:53 crc kubenswrapper[4626]: I0223 06:57:53.820184 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7969ff7869-6vz6l"] Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.009113 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2788c470-7610-45b4-a588-15d30d34acc7" path="/var/lib/kubelet/pods/2788c470-7610-45b4-a588-15d30d34acc7/volumes" Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.009481 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849db342-900a-45c9-9edd-1f2b180a324e" path="/var/lib/kubelet/pods/849db342-900a-45c9-9edd-1f2b180a324e/volumes" Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.009868 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6f1af0-99c0-47ba-95f5-21e5204b7be1" path="/var/lib/kubelet/pods/bb6f1af0-99c0-47ba-95f5-21e5204b7be1/volumes" Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.010362 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-88448458f-7j4tm"] Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.034713 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-k59cf"] Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.090924 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8474bb99-4mn6h"] Feb 23 06:57:54 crc kubenswrapper[4626]: E0223 06:57:54.248430 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Feb 23 06:57:54 crc kubenswrapper[4626]: E0223 06:57:54.248482 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Feb 23 06:57:54 crc kubenswrapper[4626]: E0223 06:57:54.248636 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lrlw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(c8ff8af3-00d8-482b-9054-fa9b2f3bc766): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:57:54 crc kubenswrapper[4626]: E0223 06:57:54.249831 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" Feb 23 06:57:54 crc kubenswrapper[4626]: W0223 06:57:54.260935 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f0ca890_098e_4dea_b359_10582f94c059.slice/crio-6146c4cfb2d96c95a4587d3299ccc9520a19b17904fbbc95d135f034e5bd1b75 WatchSource:0}: Error finding container 6146c4cfb2d96c95a4587d3299ccc9520a19b17904fbbc95d135f034e5bd1b75: Status 404 returned error can't find the container with id 
6146c4cfb2d96c95a4587d3299ccc9520a19b17904fbbc95d135f034e5bd1b75 Feb 23 06:57:54 crc kubenswrapper[4626]: W0223 06:57:54.269673 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64931128_a9a9_43f5_a7ad_6a9d4ba7b642.slice/crio-14287deeda76325826023952347173d4cef16fdf3e87c4f3b6f60d6b89370a91 WatchSource:0}: Error finding container 14287deeda76325826023952347173d4cef16fdf3e87c4f3b6f60d6b89370a91: Status 404 returned error can't find the container with id 14287deeda76325826023952347173d4cef16fdf3e87c4f3b6f60d6b89370a91 Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.737126 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k59cf" event={"ID":"707342b7-0d1b-431f-98bb-99af693f57b2","Type":"ContainerStarted","Data":"c8c8955403c8589e1818bac1cacf8669aa36125e827bd749af9fa7e3b7e936e4"} Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.749017 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9j9vm" event={"ID":"61db3f96-4a68-44bd-82ff-076ba32d9066","Type":"ContainerStarted","Data":"468e83f0fff7b6509067199c7e34c8a86106764ac67929364bc2f2feede210a6"} Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.749418 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9j9vm" Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.754957 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-574ch" event={"ID":"afd241f0-61b5-4185-a928-41cf22745048","Type":"ContainerStarted","Data":"40c5f2a33aecdec911c13f9f49a29ae09aa94dc8f2550a2c09813016693544a3"} Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.759484 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" 
event={"ID":"64931128-a9a9-43f5-a7ad-6a9d4ba7b642","Type":"ContainerStarted","Data":"14287deeda76325826023952347173d4cef16fdf3e87c4f3b6f60d6b89370a91"} Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.761720 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3b2ea075-8063-4feb-8e91-3160073129ff","Type":"ContainerStarted","Data":"23d9905416a11ee48901f487e33531b9ae355c13a12e99f42302e6d970ddc099"} Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.766024 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-88448458f-7j4tm" event={"ID":"7f0ca890-098e-4dea-b359-10582f94c059","Type":"ContainerStarted","Data":"6146c4cfb2d96c95a4587d3299ccc9520a19b17904fbbc95d135f034e5bd1b75"} Feb 23 06:57:54 crc kubenswrapper[4626]: E0223 06:57:54.775769 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb\\\"\"" pod="openstack/kube-state-metrics-0" podUID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" Feb 23 06:57:54 crc kubenswrapper[4626]: I0223 06:57:54.789949 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9j9vm" podStartSLOduration=10.052775853 podStartE2EDuration="27.789909005s" podCreationTimestamp="2026-02-23 06:57:27 +0000 UTC" firstStartedPulling="2026-02-23 06:57:35.899726948 +0000 UTC m=+1008.239056214" lastFinishedPulling="2026-02-23 06:57:53.6368601 +0000 UTC m=+1025.976189366" observedRunningTime="2026-02-23 06:57:54.769105261 +0000 UTC m=+1027.108434517" watchObservedRunningTime="2026-02-23 06:57:54.789909005 +0000 UTC m=+1027.129238472" Feb 23 06:57:55 crc kubenswrapper[4626]: I0223 06:57:55.781893 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-574ch" 
event={"ID":"afd241f0-61b5-4185-a928-41cf22745048","Type":"ContainerDied","Data":"40c5f2a33aecdec911c13f9f49a29ae09aa94dc8f2550a2c09813016693544a3"} Feb 23 06:57:55 crc kubenswrapper[4626]: I0223 06:57:55.781846 4626 generic.go:334] "Generic (PLEG): container finished" podID="afd241f0-61b5-4185-a928-41cf22745048" containerID="40c5f2a33aecdec911c13f9f49a29ae09aa94dc8f2550a2c09813016693544a3" exitCode=0 Feb 23 06:57:55 crc kubenswrapper[4626]: I0223 06:57:55.784850 4626 generic.go:334] "Generic (PLEG): container finished" podID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerID="c441b6a0f9ddcf72814a40da14be91ba4883f0cb25ab8d6595e9ae3f23b73f6c" exitCode=0 Feb 23 06:57:55 crc kubenswrapper[4626]: I0223 06:57:55.784937 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" event={"ID":"64931128-a9a9-43f5-a7ad-6a9d4ba7b642","Type":"ContainerDied","Data":"c441b6a0f9ddcf72814a40da14be91ba4883f0cb25ab8d6595e9ae3f23b73f6c"} Feb 23 06:57:55 crc kubenswrapper[4626]: I0223 06:57:55.786626 4626 generic.go:334] "Generic (PLEG): container finished" podID="7f0ca890-098e-4dea-b359-10582f94c059" containerID="6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43" exitCode=0 Feb 23 06:57:55 crc kubenswrapper[4626]: I0223 06:57:55.786661 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-88448458f-7j4tm" event={"ID":"7f0ca890-098e-4dea-b359-10582f94c059","Type":"ContainerDied","Data":"6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43"} Feb 23 06:57:56 crc kubenswrapper[4626]: I0223 06:57:56.798399 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f76a96be-520f-46e8-9e47-4a4d3237359e","Type":"ContainerStarted","Data":"f10da9b7d3726b48c8b2d0851fbea9c7d6d3477e17b164518bdb6f7d394ce96e"} Feb 23 06:57:56 crc kubenswrapper[4626]: I0223 06:57:56.805734 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-574ch" event={"ID":"afd241f0-61b5-4185-a928-41cf22745048","Type":"ContainerStarted","Data":"7e47ee9e17972ddc049e1f8b9437010bbbf87ed03dd442645dae4a83b6c18112"} Feb 23 06:57:56 crc kubenswrapper[4626]: I0223 06:57:56.808558 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-88448458f-7j4tm" event={"ID":"7f0ca890-098e-4dea-b359-10582f94c059","Type":"ContainerStarted","Data":"000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641"} Feb 23 06:57:56 crc kubenswrapper[4626]: I0223 06:57:56.809514 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-88448458f-7j4tm" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.822439 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b918ead-8c70-463f-b938-948436aa4278","Type":"ContainerStarted","Data":"cc99ddfc4477aa58d9b094b0c683de487c8b7002e732a2da9c53bffaa3eb735e"} Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.823100 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b918ead-8c70-463f-b938-948436aa4278","Type":"ContainerStarted","Data":"56ff2e03b8dd227d1a3a95a74fcbcb781715475f0f72cdcb8954483a107dab89"} Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.824760 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-k59cf" event={"ID":"707342b7-0d1b-431f-98bb-99af693f57b2","Type":"ContainerStarted","Data":"015eebc3f598b5ce94981dcb4a2dde686f6e388d933c273ca404c1c687525b24"} Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.827107 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f76a96be-520f-46e8-9e47-4a4d3237359e","Type":"ContainerStarted","Data":"68000dd43cd9938c6d937201c4f09019d427fe43fd66e6ad1b10d7381878b1a4"} Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.830808 4626 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-574ch" event={"ID":"afd241f0-61b5-4185-a928-41cf22745048","Type":"ContainerStarted","Data":"5367a6cacbd93c5fb92d02f7a05767a5e79f02dd90fffd7d8a640168b6ed6f7e"} Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.831301 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.831332 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.833414 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" event={"ID":"64931128-a9a9-43f5-a7ad-6a9d4ba7b642","Type":"ContainerStarted","Data":"dd8cd9f28b39de6e626556ddbf425d982bddda8ec482402092bd612134adb3a7"} Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.834655 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.848690 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-88448458f-7j4tm" podStartSLOduration=6.182380453 podStartE2EDuration="6.84867252s" podCreationTimestamp="2026-02-23 06:57:51 +0000 UTC" firstStartedPulling="2026-02-23 06:57:54.263011513 +0000 UTC m=+1026.602340780" lastFinishedPulling="2026-02-23 06:57:54.929303581 +0000 UTC m=+1027.268632847" observedRunningTime="2026-02-23 06:57:56.828359334 +0000 UTC m=+1029.167688601" watchObservedRunningTime="2026-02-23 06:57:57.84867252 +0000 UTC m=+1030.188001786" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.852204 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=27.680506139 podStartE2EDuration="30.852186946s" podCreationTimestamp="2026-02-23 06:57:27 +0000 UTC" 
firstStartedPulling="2026-02-23 06:57:53.124920491 +0000 UTC m=+1025.464249756" lastFinishedPulling="2026-02-23 06:57:56.296601298 +0000 UTC m=+1028.635930563" observedRunningTime="2026-02-23 06:57:57.845222614 +0000 UTC m=+1030.184551881" watchObservedRunningTime="2026-02-23 06:57:57.852186946 +0000 UTC m=+1030.191516212" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.868541 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=25.696859936 podStartE2EDuration="28.868517748s" podCreationTimestamp="2026-02-23 06:57:29 +0000 UTC" firstStartedPulling="2026-02-23 06:57:53.126110043 +0000 UTC m=+1025.465439310" lastFinishedPulling="2026-02-23 06:57:56.297767857 +0000 UTC m=+1028.637097122" observedRunningTime="2026-02-23 06:57:57.864331264 +0000 UTC m=+1030.203660530" watchObservedRunningTime="2026-02-23 06:57:57.868517748 +0000 UTC m=+1030.207847013" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.891432 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-k59cf" podStartSLOduration=4.792728989 podStartE2EDuration="6.891402726s" podCreationTimestamp="2026-02-23 06:57:51 +0000 UTC" firstStartedPulling="2026-02-23 06:57:54.26722606 +0000 UTC m=+1026.606555326" lastFinishedPulling="2026-02-23 06:57:56.365899808 +0000 UTC m=+1028.705229063" observedRunningTime="2026-02-23 06:57:57.882135043 +0000 UTC m=+1030.221464309" watchObservedRunningTime="2026-02-23 06:57:57.891402726 +0000 UTC m=+1030.230731992" Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.918723 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" podStartSLOduration=6.29023037 podStartE2EDuration="6.918699794s" podCreationTimestamp="2026-02-23 06:57:51 +0000 UTC" firstStartedPulling="2026-02-23 06:57:54.2796531 +0000 UTC m=+1026.618982367" lastFinishedPulling="2026-02-23 06:57:54.908122525 
+0000 UTC m=+1027.247451791" observedRunningTime="2026-02-23 06:57:57.916730491 +0000 UTC m=+1030.256059757" watchObservedRunningTime="2026-02-23 06:57:57.918699794 +0000 UTC m=+1030.258029061"
Feb 23 06:57:57 crc kubenswrapper[4626]: I0223 06:57:57.934274 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-574ch" podStartSLOduration=17.382957333 podStartE2EDuration="30.934256466s" podCreationTimestamp="2026-02-23 06:57:27 +0000 UTC" firstStartedPulling="2026-02-23 06:57:40.080574234 +0000 UTC m=+1012.419903490" lastFinishedPulling="2026-02-23 06:57:53.631873357 +0000 UTC m=+1025.971202623" observedRunningTime="2026-02-23 06:57:57.93257176 +0000 UTC m=+1030.271901025" watchObservedRunningTime="2026-02-23 06:57:57.934256466 +0000 UTC m=+1030.273585732"
Feb 23 06:57:58 crc kubenswrapper[4626]: I0223 06:57:58.329463 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 23 06:57:58 crc kubenswrapper[4626]: I0223 06:57:58.775947 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 23 06:57:58 crc kubenswrapper[4626]: I0223 06:57:58.775990 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 23 06:57:58 crc kubenswrapper[4626]: I0223 06:57:58.843132 4626 generic.go:334] "Generic (PLEG): container finished" podID="3b2ea075-8063-4feb-8e91-3160073129ff" containerID="23d9905416a11ee48901f487e33531b9ae355c13a12e99f42302e6d970ddc099" exitCode=0
Feb 23 06:57:58 crc kubenswrapper[4626]: I0223 06:57:58.843215 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3b2ea075-8063-4feb-8e91-3160073129ff","Type":"ContainerDied","Data":"23d9905416a11ee48901f487e33531b9ae355c13a12e99f42302e6d970ddc099"}
Feb 23 06:57:59 crc kubenswrapper[4626]: I0223 06:57:59.853809 4626 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openstack/openstack-galera-0" event={"ID":"3b2ea075-8063-4feb-8e91-3160073129ff","Type":"ContainerStarted","Data":"f44d0359e7016da66578a340108f6cabc7d3b58d262d1576f445f84eb90cfd81"}
Feb 23 06:57:59 crc kubenswrapper[4626]: I0223 06:57:59.873513 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.312912513 podStartE2EDuration="40.873478722s" podCreationTimestamp="2026-02-23 06:57:19 +0000 UTC" firstStartedPulling="2026-02-23 06:57:21.537692472 +0000 UTC m=+993.877021728" lastFinishedPulling="2026-02-23 06:57:53.098258671 +0000 UTC m=+1025.437587937" observedRunningTime="2026-02-23 06:57:59.870230998 +0000 UTC m=+1032.209560265" watchObservedRunningTime="2026-02-23 06:57:59.873478722 +0000 UTC m=+1032.212807989"
Feb 23 06:58:00 crc kubenswrapper[4626]: I0223 06:58:00.866329 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"03a2b658-d642-449c-be58-94b80484618e","Type":"ContainerStarted","Data":"8537c2aef4447d101e9c1e4d64a812adbd0242408550b8ffb55f83046d11f76d"}
Feb 23 06:58:00 crc kubenswrapper[4626]: I0223 06:58:00.932532 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 23 06:58:00 crc kubenswrapper[4626]: I0223 06:58:00.932603 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 23 06:58:01 crc kubenswrapper[4626]: I0223 06:58:01.329862 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 23 06:58:01 crc kubenswrapper[4626]: I0223 06:58:01.361353 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 23 06:58:01 crc kubenswrapper[4626]: I0223 06:58:01.797729 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-88448458f-7j4tm"
Feb 23 06:58:01 crc
kubenswrapper[4626]: I0223 06:58:01.807222 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 23 06:58:01 crc kubenswrapper[4626]: I0223 06:58:01.864246 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 23 06:58:01 crc kubenswrapper[4626]: I0223 06:58:01.928126 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 23 06:58:01 crc kubenswrapper[4626]: I0223 06:58:01.969635 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.056268 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-88448458f-7j4tm"]
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.056539 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-88448458f-7j4tm" podUID="7f0ca890-098e-4dea-b359-10582f94c059" containerName="dnsmasq-dns" containerID="cri-o://000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641" gracePeriod=10
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.336672 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.338032 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.350081 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r25s4"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.350272 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.350739 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.350741 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.355192 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.490758 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9668927a-b529-4f44-a093-41260f069e34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.490876 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.491097 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ssjn\" (UniqueName: \"kubernetes.io/projected/9668927a-b529-4f44-a093-41260f069e34-kube-api-access-9ssjn\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") "
pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.491158 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9668927a-b529-4f44-a093-41260f069e34-scripts\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.491245 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.491278 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9668927a-b529-4f44-a093-41260f069e34-config\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.491361 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.593389 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ssjn\" (UniqueName: \"kubernetes.io/projected/9668927a-b529-4f44-a093-41260f069e34-kube-api-access-9ssjn\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.593449 4626 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9668927a-b529-4f44-a093-41260f069e34-scripts\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.593569 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9668927a-b529-4f44-a093-41260f069e34-config\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.593593 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.593673 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.593759 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9668927a-b529-4f44-a093-41260f069e34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.593821 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID:
\"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.594542 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9668927a-b529-4f44-a093-41260f069e34-scripts\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.594895 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9668927a-b529-4f44-a093-41260f069e34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.595138 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9668927a-b529-4f44-a093-41260f069e34-config\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.595452 4626 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-88448458f-7j4tm"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.601318 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.601586 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.603694 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9668927a-b529-4f44-a093-41260f069e34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.611126 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ssjn\" (UniqueName: \"kubernetes.io/projected/9668927a-b529-4f44-a093-41260f069e34-kube-api-access-9ssjn\") pod \"ovn-northd-0\" (UID: \"9668927a-b529-4f44-a093-41260f069e34\") " pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.671443 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.696004 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnms8\" (UniqueName: \"kubernetes.io/projected/7f0ca890-098e-4dea-b359-10582f94c059-kube-api-access-gnms8\") pod \"7f0ca890-098e-4dea-b359-10582f94c059\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") "
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.696443 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-config\") pod \"7f0ca890-098e-4dea-b359-10582f94c059\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") "
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.696525 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-dns-svc\") pod \"7f0ca890-098e-4dea-b359-10582f94c059\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") "
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.696653 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-ovsdbserver-nb\") pod \"7f0ca890-098e-4dea-b359-10582f94c059\" (UID: \"7f0ca890-098e-4dea-b359-10582f94c059\") "
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.703059 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0ca890-098e-4dea-b359-10582f94c059-kube-api-access-gnms8" (OuterVolumeSpecName: "kube-api-access-gnms8") pod "7f0ca890-098e-4dea-b359-10582f94c059" (UID: "7f0ca890-098e-4dea-b359-10582f94c059"). InnerVolumeSpecName "kube-api-access-gnms8".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.750172 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f0ca890-098e-4dea-b359-10582f94c059" (UID: "7f0ca890-098e-4dea-b359-10582f94c059"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.753891 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-config" (OuterVolumeSpecName: "config") pod "7f0ca890-098e-4dea-b359-10582f94c059" (UID: "7f0ca890-098e-4dea-b359-10582f94c059"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.753979 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f0ca890-098e-4dea-b359-10582f94c059" (UID: "7f0ca890-098e-4dea-b359-10582f94c059"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.800185 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-config\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.800223 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.800235 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f0ca890-098e-4dea-b359-10582f94c059-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.800246 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnms8\" (UniqueName: \"kubernetes.io/projected/7f0ca890-098e-4dea-b359-10582f94c059-kube-api-access-gnms8\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.886304 4626 generic.go:334] "Generic (PLEG): container finished" podID="7f0ca890-098e-4dea-b359-10582f94c059" containerID="000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641" exitCode=0
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.886397 4626 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-88448458f-7j4tm"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.886477 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-88448458f-7j4tm" event={"ID":"7f0ca890-098e-4dea-b359-10582f94c059","Type":"ContainerDied","Data":"000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641"}
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.886537 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-88448458f-7j4tm" event={"ID":"7f0ca890-098e-4dea-b359-10582f94c059","Type":"ContainerDied","Data":"6146c4cfb2d96c95a4587d3299ccc9520a19b17904fbbc95d135f034e5bd1b75"}
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.886561 4626 scope.go:117] "RemoveContainer" containerID="000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.906316 4626 scope.go:117] "RemoveContainer" containerID="6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.931149 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-88448458f-7j4tm"]
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.940346 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-88448458f-7j4tm"]
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.948025 4626 scope.go:117] "RemoveContainer" containerID="000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641"
Feb 23 06:58:02 crc kubenswrapper[4626]: E0223 06:58:02.948433 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641\": container with ID starting with 000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641 not found: ID does not exist"
containerID="000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.948463 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641"} err="failed to get container status \"000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641\": rpc error: code = NotFound desc = could not find container \"000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641\": container with ID starting with 000a8490adc4281c7df3e66157d43e91eee7150ac96af62205e1ff3c1bd55641 not found: ID does not exist"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.948514 4626 scope.go:117] "RemoveContainer" containerID="6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43"
Feb 23 06:58:02 crc kubenswrapper[4626]: E0223 06:58:02.948788 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43\": container with ID starting with 6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43 not found: ID does not exist" containerID="6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43"
Feb 23 06:58:02 crc kubenswrapper[4626]: I0223 06:58:02.948820 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43"} err="failed to get container status \"6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43\": rpc error: code = NotFound desc = could not find container \"6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43\": container with ID starting with 6e9c40ccb5892d5a8a3d0f34dbec2ec3798f56531c8a6d49197c243284175c43 not found: ID does not exist"
Feb 23 06:58:03 crc kubenswrapper[4626]: I0223 06:58:03.098582 4626 kubelet.go:2428]
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 23 06:58:03 crc kubenswrapper[4626]: I0223 06:58:03.896084 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1","Type":"ContainerStarted","Data":"51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03"}
Feb 23 06:58:03 crc kubenswrapper[4626]: I0223 06:58:03.899663 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb16cec1-24fc-4504-8968-0c3fb8368f27","Type":"ContainerStarted","Data":"06e385e38283193cdbfcddd982ff856e4477c5930a30d9606b70b08d9fcacc08"}
Feb 23 06:58:03 crc kubenswrapper[4626]: I0223 06:58:03.901255 4626 generic.go:334] "Generic (PLEG): container finished" podID="03a2b658-d642-449c-be58-94b80484618e" containerID="8537c2aef4447d101e9c1e4d64a812adbd0242408550b8ffb55f83046d11f76d" exitCode=0
Feb 23 06:58:03 crc kubenswrapper[4626]: I0223 06:58:03.901312 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"03a2b658-d642-449c-be58-94b80484618e","Type":"ContainerDied","Data":"8537c2aef4447d101e9c1e4d64a812adbd0242408550b8ffb55f83046d11f76d"}
Feb 23 06:58:03 crc kubenswrapper[4626]: I0223 06:58:03.907024 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9668927a-b529-4f44-a093-41260f069e34","Type":"ContainerStarted","Data":"1cbcaf2447f2b95416bc46b7618e84b9bdfe221316476b9902de672f646ef0ce"}
Feb 23 06:58:03 crc kubenswrapper[4626]: I0223 06:58:03.990654 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0ca890-098e-4dea-b359-10582f94c059" path="/var/lib/kubelet/pods/7f0ca890-098e-4dea-b359-10582f94c059/volumes"
Feb 23 06:58:04 crc kubenswrapper[4626]: I0223 06:58:04.915200 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0"
event={"ID":"03a2b658-d642-449c-be58-94b80484618e","Type":"ContainerStarted","Data":"0c609232d403f39e538145b54f1947330ad818dec23a5f053b63326e73c0eb72"}
Feb 23 06:58:04 crc kubenswrapper[4626]: I0223 06:58:04.918393 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9668927a-b529-4f44-a093-41260f069e34","Type":"ContainerStarted","Data":"8b6ca54e3145ee97b39edf4036b57248677fcc123f682b7764241d011d1d35b1"}
Feb 23 06:58:04 crc kubenswrapper[4626]: I0223 06:58:04.933878 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371991.920916 podStartE2EDuration="44.933860388s" podCreationTimestamp="2026-02-23 06:57:20 +0000 UTC" firstStartedPulling="2026-02-23 06:57:22.946247109 +0000 UTC m=+995.285576375" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:04.931593875 +0000 UTC m=+1037.270923141" watchObservedRunningTime="2026-02-23 06:58:04.933860388 +0000 UTC m=+1037.273189644"
Feb 23 06:58:05 crc kubenswrapper[4626]: I0223 06:58:05.928131 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"38df29c3-b467-498f-9ec1-a83cc91c27ca","Type":"ContainerStarted","Data":"94db684e62f79b308c36d1584a8c6b866210825099e56c0bd070529b0db066ee"}
Feb 23 06:58:05 crc kubenswrapper[4626]: I0223 06:58:05.928737 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 23 06:58:05 crc kubenswrapper[4626]: I0223 06:58:05.931462 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9668927a-b529-4f44-a093-41260f069e34","Type":"ContainerStarted","Data":"70846a999689fb222aa14bcdb7d8d45a53131eaeaf6679ddb0a13ba1085bf7e3"}
Feb 23 06:58:05 crc kubenswrapper[4626]: I0223 06:58:05.931645 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 23 06:58:05 crc
kubenswrapper[4626]: I0223 06:58:05.950926 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.303520137 podStartE2EDuration="43.950911533s" podCreationTimestamp="2026-02-23 06:57:22 +0000 UTC" firstStartedPulling="2026-02-23 06:57:22.990441736 +0000 UTC m=+995.329771002" lastFinishedPulling="2026-02-23 06:58:05.637833132 +0000 UTC m=+1037.977162398" observedRunningTime="2026-02-23 06:58:05.943034812 +0000 UTC m=+1038.282364078" watchObservedRunningTime="2026-02-23 06:58:05.950911533 +0000 UTC m=+1038.290240800"
Feb 23 06:58:05 crc kubenswrapper[4626]: I0223 06:58:05.966087 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.582456865 podStartE2EDuration="3.96607282s" podCreationTimestamp="2026-02-23 06:58:02 +0000 UTC" firstStartedPulling="2026-02-23 06:58:03.199723916 +0000 UTC m=+1035.539053182" lastFinishedPulling="2026-02-23 06:58:04.583339871 +0000 UTC m=+1036.922669137" observedRunningTime="2026-02-23 06:58:05.962997682 +0000 UTC m=+1038.302326948" watchObservedRunningTime="2026-02-23 06:58:05.96607282 +0000 UTC m=+1038.305402086"
Feb 23 06:58:06 crc kubenswrapper[4626]: I0223 06:58:06.938153 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8ff8af3-00d8-482b-9054-fa9b2f3bc766","Type":"ContainerStarted","Data":"91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d"}
Feb 23 06:58:06 crc kubenswrapper[4626]: I0223 06:58:06.969867 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.482149446 podStartE2EDuration="42.969852903s" podCreationTimestamp="2026-02-23 06:57:24 +0000 UTC" firstStartedPulling="2026-02-23 06:57:25.798294704 +0000 UTC m=+998.137623970" lastFinishedPulling="2026-02-23 06:58:06.285998161 +0000 UTC m=+1038.625327427" observedRunningTime="2026-02-23
06:58:06.964543091 +0000 UTC m=+1039.303872357" watchObservedRunningTime="2026-02-23 06:58:06.969852903 +0000 UTC m=+1039.309182169"
Feb 23 06:58:07 crc kubenswrapper[4626]: I0223 06:58:07.135671 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 23 06:58:07 crc kubenswrapper[4626]: I0223 06:58:07.199009 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.617682 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nrd48"]
Feb 23 06:58:09 crc kubenswrapper[4626]: E0223 06:58:09.618270 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0ca890-098e-4dea-b359-10582f94c059" containerName="init"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.618283 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0ca890-098e-4dea-b359-10582f94c059" containerName="init"
Feb 23 06:58:09 crc kubenswrapper[4626]: E0223 06:58:09.618322 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0ca890-098e-4dea-b359-10582f94c059" containerName="dnsmasq-dns"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.618328 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0ca890-098e-4dea-b359-10582f94c059" containerName="dnsmasq-dns"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.618480 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0ca890-098e-4dea-b359-10582f94c059" containerName="dnsmasq-dns"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.618985 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/root-account-create-update-nrd48"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.620939 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.632961 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nrd48"]
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.740414 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6731c2df-92ed-46ec-89e6-4a5acf111120-operator-scripts\") pod \"root-account-create-update-nrd48\" (UID: \"6731c2df-92ed-46ec-89e6-4a5acf111120\") " pod="openstack/root-account-create-update-nrd48"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.740562 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnvg9\" (UniqueName: \"kubernetes.io/projected/6731c2df-92ed-46ec-89e6-4a5acf111120-kube-api-access-nnvg9\") pod \"root-account-create-update-nrd48\" (UID: \"6731c2df-92ed-46ec-89e6-4a5acf111120\") " pod="openstack/root-account-create-update-nrd48"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.842402 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnvg9\" (UniqueName: \"kubernetes.io/projected/6731c2df-92ed-46ec-89e6-4a5acf111120-kube-api-access-nnvg9\") pod \"root-account-create-update-nrd48\" (UID: \"6731c2df-92ed-46ec-89e6-4a5acf111120\") " pod="openstack/root-account-create-update-nrd48"
Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.842581 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6731c2df-92ed-46ec-89e6-4a5acf111120-operator-scripts\") pod \"root-account-create-update-nrd48\" (UID:
\"6731c2df-92ed-46ec-89e6-4a5acf111120\") " pod="openstack/root-account-create-update-nrd48" Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.843370 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6731c2df-92ed-46ec-89e6-4a5acf111120-operator-scripts\") pod \"root-account-create-update-nrd48\" (UID: \"6731c2df-92ed-46ec-89e6-4a5acf111120\") " pod="openstack/root-account-create-update-nrd48" Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.861717 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnvg9\" (UniqueName: \"kubernetes.io/projected/6731c2df-92ed-46ec-89e6-4a5acf111120-kube-api-access-nnvg9\") pod \"root-account-create-update-nrd48\" (UID: \"6731c2df-92ed-46ec-89e6-4a5acf111120\") " pod="openstack/root-account-create-update-nrd48" Feb 23 06:58:09 crc kubenswrapper[4626]: I0223 06:58:09.934322 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nrd48" Feb 23 06:58:10 crc kubenswrapper[4626]: I0223 06:58:10.383398 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nrd48"] Feb 23 06:58:10 crc kubenswrapper[4626]: I0223 06:58:10.984431 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nrd48" event={"ID":"6731c2df-92ed-46ec-89e6-4a5acf111120","Type":"ContainerStarted","Data":"ecebdbb12c1db60f6f757cc7a36564c05ec59ea1a6581c3e6377dccb1afc537e"} Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.293540 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.293946 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.359454 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.597714 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.782232 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-txmcg"] Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.783412 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-txmcg" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.792789 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-txmcg"] Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.893679 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5002-account-create-update-2j59l"] Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.894868 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6ttq\" (UniqueName: \"kubernetes.io/projected/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-kube-api-access-l6ttq\") pod \"glance-db-create-txmcg\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") " pod="openstack/glance-db-create-txmcg" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.895072 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-operator-scripts\") pod \"glance-db-create-txmcg\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") " pod="openstack/glance-db-create-txmcg" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.895548 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.897447 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.903063 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5002-account-create-update-2j59l"] Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.996916 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6ttq\" (UniqueName: \"kubernetes.io/projected/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-kube-api-access-l6ttq\") pod \"glance-db-create-txmcg\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") " pod="openstack/glance-db-create-txmcg" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.997231 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74xc\" (UniqueName: \"kubernetes.io/projected/bbf8821f-f9db-4112-80cb-a85ecbd60c66-kube-api-access-g74xc\") pod \"glance-5002-account-create-update-2j59l\" (UID: \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") " pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.997746 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-operator-scripts\") pod \"glance-db-create-txmcg\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") " pod="openstack/glance-db-create-txmcg" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.998017 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbf8821f-f9db-4112-80cb-a85ecbd60c66-operator-scripts\") pod \"glance-5002-account-create-update-2j59l\" (UID: 
\"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") " pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:12 crc kubenswrapper[4626]: I0223 06:58:12.998875 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-operator-scripts\") pod \"glance-db-create-txmcg\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") " pod="openstack/glance-db-create-txmcg" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.020795 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6ttq\" (UniqueName: \"kubernetes.io/projected/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-kube-api-access-l6ttq\") pod \"glance-db-create-txmcg\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") " pod="openstack/glance-db-create-txmcg" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.075390 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.099894 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-txmcg" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.100687 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74xc\" (UniqueName: \"kubernetes.io/projected/bbf8821f-f9db-4112-80cb-a85ecbd60c66-kube-api-access-g74xc\") pod \"glance-5002-account-create-update-2j59l\" (UID: \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") " pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.100944 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbf8821f-f9db-4112-80cb-a85ecbd60c66-operator-scripts\") pod \"glance-5002-account-create-update-2j59l\" (UID: \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") " pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.101669 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbf8821f-f9db-4112-80cb-a85ecbd60c66-operator-scripts\") pod \"glance-5002-account-create-update-2j59l\" (UID: \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") " pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.136005 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74xc\" (UniqueName: \"kubernetes.io/projected/bbf8821f-f9db-4112-80cb-a85ecbd60c66-kube-api-access-g74xc\") pod \"glance-5002-account-create-update-2j59l\" (UID: \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") " pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.217861 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5002-account-create-update-2j59l" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.438190 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rt5rb"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.439830 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.444949 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rt5rb"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.509113 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnqrf\" (UniqueName: \"kubernetes.io/projected/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-kube-api-access-pnqrf\") pod \"keystone-db-create-rt5rb\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") " pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.509254 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-operator-scripts\") pod \"keystone-db-create-rt5rb\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") " pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.538556 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-291c-account-create-update-6cvgc"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.539742 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.544038 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.562007 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-291c-account-create-update-6cvgc"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.610956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnqrf\" (UniqueName: \"kubernetes.io/projected/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-kube-api-access-pnqrf\") pod \"keystone-db-create-rt5rb\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") " pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.611034 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqg9j\" (UniqueName: \"kubernetes.io/projected/59fa2739-f9e3-4007-9e1c-8f95cb92713e-kube-api-access-mqg9j\") pod \"keystone-291c-account-create-update-6cvgc\" (UID: \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") " pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.611150 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-operator-scripts\") pod \"keystone-db-create-rt5rb\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") " pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.611187 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59fa2739-f9e3-4007-9e1c-8f95cb92713e-operator-scripts\") pod \"keystone-291c-account-create-update-6cvgc\" (UID: 
\"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") " pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.611980 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-operator-scripts\") pod \"keystone-db-create-rt5rb\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") " pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.612890 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-txmcg"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.637448 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnqrf\" (UniqueName: \"kubernetes.io/projected/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-kube-api-access-pnqrf\") pod \"keystone-db-create-rt5rb\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") " pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.713358 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59fa2739-f9e3-4007-9e1c-8f95cb92713e-operator-scripts\") pod \"keystone-291c-account-create-update-6cvgc\" (UID: \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") " pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.713735 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqg9j\" (UniqueName: \"kubernetes.io/projected/59fa2739-f9e3-4007-9e1c-8f95cb92713e-kube-api-access-mqg9j\") pod \"keystone-291c-account-create-update-6cvgc\" (UID: \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") " pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.714580 4626 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59fa2739-f9e3-4007-9e1c-8f95cb92713e-operator-scripts\") pod \"keystone-291c-account-create-update-6cvgc\" (UID: \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") " pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.746203 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5002-account-create-update-2j59l"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.756020 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqg9j\" (UniqueName: \"kubernetes.io/projected/59fa2739-f9e3-4007-9e1c-8f95cb92713e-kube-api-access-mqg9j\") pod \"keystone-291c-account-create-update-6cvgc\" (UID: \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") " pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.781990 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rt5rb" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.821223 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vvf5n"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.853186 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-291c-account-create-update-6cvgc" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.854415 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.887407 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vvf5n"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.896311 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4bdb-account-create-update-2s5v2"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.897709 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.903602 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4bdb-account-create-update-2s5v2"] Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.904138 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.925352 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-operator-scripts\") pod \"placement-4bdb-account-create-update-2s5v2\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.925414 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986e7467-8ed5-4f55-8518-e6c539b02c17-operator-scripts\") pod \"placement-db-create-vvf5n\" (UID: \"986e7467-8ed5-4f55-8518-e6c539b02c17\") " pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.925463 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvwvw\" (UniqueName: 
\"kubernetes.io/projected/986e7467-8ed5-4f55-8518-e6c539b02c17-kube-api-access-tvwvw\") pod \"placement-db-create-vvf5n\" (UID: \"986e7467-8ed5-4f55-8518-e6c539b02c17\") " pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:13 crc kubenswrapper[4626]: I0223 06:58:13.925538 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphhp\" (UniqueName: \"kubernetes.io/projected/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-kube-api-access-xphhp\") pod \"placement-4bdb-account-create-update-2s5v2\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.028994 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986e7467-8ed5-4f55-8518-e6c539b02c17-operator-scripts\") pod \"placement-db-create-vvf5n\" (UID: \"986e7467-8ed5-4f55-8518-e6c539b02c17\") " pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.029215 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvwvw\" (UniqueName: \"kubernetes.io/projected/986e7467-8ed5-4f55-8518-e6c539b02c17-kube-api-access-tvwvw\") pod \"placement-db-create-vvf5n\" (UID: \"986e7467-8ed5-4f55-8518-e6c539b02c17\") " pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.029261 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphhp\" (UniqueName: \"kubernetes.io/projected/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-kube-api-access-xphhp\") pod \"placement-4bdb-account-create-update-2s5v2\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.029389 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-operator-scripts\") pod \"placement-4bdb-account-create-update-2s5v2\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.031152 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986e7467-8ed5-4f55-8518-e6c539b02c17-operator-scripts\") pod \"placement-db-create-vvf5n\" (UID: \"986e7467-8ed5-4f55-8518-e6c539b02c17\") " pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.032016 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-operator-scripts\") pod \"placement-4bdb-account-create-update-2s5v2\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.034139 4626 generic.go:334] "Generic (PLEG): container finished" podID="6731c2df-92ed-46ec-89e6-4a5acf111120" containerID="335080fdbc50ec94238cbe05a5d58e68c601cf4173ffeb5e5ce58f7cbd592c4a" exitCode=0 Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.034989 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nrd48" event={"ID":"6731c2df-92ed-46ec-89e6-4a5acf111120","Type":"ContainerDied","Data":"335080fdbc50ec94238cbe05a5d58e68c601cf4173ffeb5e5ce58f7cbd592c4a"} Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.052328 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvwvw\" (UniqueName: \"kubernetes.io/projected/986e7467-8ed5-4f55-8518-e6c539b02c17-kube-api-access-tvwvw\") pod \"placement-db-create-vvf5n\" (UID: 
\"986e7467-8ed5-4f55-8518-e6c539b02c17\") " pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.055042 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-txmcg" event={"ID":"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a","Type":"ContainerStarted","Data":"ff882c8ab5749639b1baa3892d8e22f4647306e1e844fc7420c59c344c3bf9e1"} Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.055073 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-txmcg" event={"ID":"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a","Type":"ContainerStarted","Data":"b1921a45c55d389c30642f09f66433dbe54a344bf4f1b1d26dd64e03a85a0cce"} Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.061097 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphhp\" (UniqueName: \"kubernetes.io/projected/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-kube-api-access-xphhp\") pod \"placement-4bdb-account-create-update-2s5v2\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.061171 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5002-account-create-update-2j59l" event={"ID":"bbf8821f-f9db-4112-80cb-a85ecbd60c66","Type":"ContainerStarted","Data":"16fd182cc9a9de5e274c255821df831d2cfab8de39e03cf5651627676b35bcd7"} Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.075970 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-txmcg" podStartSLOduration=2.075955767 podStartE2EDuration="2.075955767s" podCreationTimestamp="2026-02-23 06:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:14.075852391 +0000 UTC m=+1046.415181648" watchObservedRunningTime="2026-02-23 06:58:14.075955767 +0000 UTC 
m=+1046.415285033" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.108690 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-5002-account-create-update-2j59l" podStartSLOduration=2.108672493 podStartE2EDuration="2.108672493s" podCreationTimestamp="2026-02-23 06:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:14.098995066 +0000 UTC m=+1046.438324323" watchObservedRunningTime="2026-02-23 06:58:14.108672493 +0000 UTC m=+1046.448001759" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.191103 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.231150 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.379623 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rt5rb"] Feb 23 06:58:14 crc kubenswrapper[4626]: W0223 06:58:14.385131 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0dcb83c_5b75_4b79_ba8f_9ac4464efaf6.slice/crio-6258af909bd4d78c4b015a352b658f38f5208956a692cdf37f1f8893298a7b14 WatchSource:0}: Error finding container 6258af909bd4d78c4b015a352b658f38f5208956a692cdf37f1f8893298a7b14: Status 404 returned error can't find the container with id 6258af909bd4d78c4b015a352b658f38f5208956a692cdf37f1f8893298a7b14 Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.452436 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-291c-account-create-update-6cvgc"] Feb 23 06:58:14 crc kubenswrapper[4626]: W0223 06:58:14.471245 4626 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59fa2739_f9e3_4007_9e1c_8f95cb92713e.slice/crio-21548745a324f434784a1f4538e6b593cb25804b791f1a19b57bf14c03b9f880 WatchSource:0}: Error finding container 21548745a324f434784a1f4538e6b593cb25804b791f1a19b57bf14c03b9f880: Status 404 returned error can't find the container with id 21548745a324f434784a1f4538e6b593cb25804b791f1a19b57bf14c03b9f880 Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.643851 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vvf5n"] Feb 23 06:58:14 crc kubenswrapper[4626]: I0223 06:58:14.754893 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4bdb-account-create-update-2s5v2"] Feb 23 06:58:14 crc kubenswrapper[4626]: W0223 06:58:14.762075 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd351e8f0_0b2c_4e4a_a6ee_3540a45a2ac9.slice/crio-31fe7d6078f1c96698e3ba8922351285d5e3a4c58ce0301f6255807699982758 WatchSource:0}: Error finding container 31fe7d6078f1c96698e3ba8922351285d5e3a4c58ce0301f6255807699982758: Status 404 returned error can't find the container with id 31fe7d6078f1c96698e3ba8922351285d5e3a4c58ce0301f6255807699982758 Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.073479 4626 generic.go:334] "Generic (PLEG): container finished" podID="e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a" containerID="ff882c8ab5749639b1baa3892d8e22f4647306e1e844fc7420c59c344c3bf9e1" exitCode=0 Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.074020 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-txmcg" event={"ID":"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a","Type":"ContainerDied","Data":"ff882c8ab5749639b1baa3892d8e22f4647306e1e844fc7420c59c344c3bf9e1"} Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.075286 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6" containerID="10b027b83b437b6c5981eb0d5984bab81043cd8bf1425df719420e75c22fa87e" exitCode=0 Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.075335 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rt5rb" event={"ID":"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6","Type":"ContainerDied","Data":"10b027b83b437b6c5981eb0d5984bab81043cd8bf1425df719420e75c22fa87e"} Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.075352 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rt5rb" event={"ID":"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6","Type":"ContainerStarted","Data":"6258af909bd4d78c4b015a352b658f38f5208956a692cdf37f1f8893298a7b14"} Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.076423 4626 generic.go:334] "Generic (PLEG): container finished" podID="bbf8821f-f9db-4112-80cb-a85ecbd60c66" containerID="1e0962286903e693d402f0eeda854adbaa71b06ddae8f2c468fe8523070e8f39" exitCode=0 Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.076464 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5002-account-create-update-2j59l" event={"ID":"bbf8821f-f9db-4112-80cb-a85ecbd60c66","Type":"ContainerDied","Data":"1e0962286903e693d402f0eeda854adbaa71b06ddae8f2c468fe8523070e8f39"} Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.077434 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4bdb-account-create-update-2s5v2" event={"ID":"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9","Type":"ContainerStarted","Data":"83a1ca0d21c9a868af64d8bf98fc7001036bc86e1482b053dbd1ac3070a989be"} Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.077461 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4bdb-account-create-update-2s5v2" event={"ID":"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9","Type":"ContainerStarted","Data":"31fe7d6078f1c96698e3ba8922351285d5e3a4c58ce0301f6255807699982758"} Feb 23 
06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.079285 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vvf5n" event={"ID":"986e7467-8ed5-4f55-8518-e6c539b02c17","Type":"ContainerStarted","Data":"90f39ed5761bd663b510d80526abdec8fdc67d692b43aa71a0ac30d46daa28ee"}
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.079309 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vvf5n" event={"ID":"986e7467-8ed5-4f55-8518-e6c539b02c17","Type":"ContainerStarted","Data":"552f1f7797599a57213536432ee1ea0ba76dd0cbcaa0450afe85435cd47be793"}
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.080681 4626 generic.go:334] "Generic (PLEG): container finished" podID="59fa2739-f9e3-4007-9e1c-8f95cb92713e" containerID="414cd5241cbb972da64432d6754407471801b969a757625d7f56b7fa2ebd5650" exitCode=0
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.080859 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-291c-account-create-update-6cvgc" event={"ID":"59fa2739-f9e3-4007-9e1c-8f95cb92713e","Type":"ContainerDied","Data":"414cd5241cbb972da64432d6754407471801b969a757625d7f56b7fa2ebd5650"}
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.080882 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-291c-account-create-update-6cvgc" event={"ID":"59fa2739-f9e3-4007-9e1c-8f95cb92713e","Type":"ContainerStarted","Data":"21548745a324f434784a1f4538e6b593cb25804b791f1a19b57bf14c03b9f880"}
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.183717 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-4bdb-account-create-update-2s5v2" podStartSLOduration=2.1836992 podStartE2EDuration="2.1836992s" podCreationTimestamp="2026-02-23 06:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:15.174936017 +0000 UTC m=+1047.514265282" watchObservedRunningTime="2026-02-23 06:58:15.1836992 +0000 UTC m=+1047.523028456"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.201611 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.211571 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-687c99d675-v44bm"]
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.213137 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.220264 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.226406 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-vvf5n" podStartSLOduration=2.226386455 podStartE2EDuration="2.226386455s" podCreationTimestamp="2026-02-23 06:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:15.208272021 +0000 UTC m=+1047.547601287" watchObservedRunningTime="2026-02-23 06:58:15.226386455 +0000 UTC m=+1047.565715721"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.237085 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-687c99d675-v44bm"]
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.257522 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-nb\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.257565 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j22w2\" (UniqueName: \"kubernetes.io/projected/70d23afb-92d9-44e1-896e-c048ca8fe3d7-kube-api-access-j22w2\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.257584 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-sb\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.257643 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-dns-svc\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.257678 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-config\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.361319 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j22w2\" (UniqueName: \"kubernetes.io/projected/70d23afb-92d9-44e1-896e-c048ca8fe3d7-kube-api-access-j22w2\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.361363 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-sb\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.361438 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-dns-svc\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.361472 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-config\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.361577 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-nb\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.362336 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-nb\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.363049 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-sb\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.367239 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-config\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.376859 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-dns-svc\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.400358 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j22w2\" (UniqueName: \"kubernetes.io/projected/70d23afb-92d9-44e1-896e-c048ca8fe3d7-kube-api-access-j22w2\") pod \"dnsmasq-dns-687c99d675-v44bm\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.588882 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-687c99d675-v44bm"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.757029 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nrd48"
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.782599 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnvg9\" (UniqueName: \"kubernetes.io/projected/6731c2df-92ed-46ec-89e6-4a5acf111120-kube-api-access-nnvg9\") pod \"6731c2df-92ed-46ec-89e6-4a5acf111120\" (UID: \"6731c2df-92ed-46ec-89e6-4a5acf111120\") "
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.782657 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6731c2df-92ed-46ec-89e6-4a5acf111120-operator-scripts\") pod \"6731c2df-92ed-46ec-89e6-4a5acf111120\" (UID: \"6731c2df-92ed-46ec-89e6-4a5acf111120\") "
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.783897 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6731c2df-92ed-46ec-89e6-4a5acf111120-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6731c2df-92ed-46ec-89e6-4a5acf111120" (UID: "6731c2df-92ed-46ec-89e6-4a5acf111120"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.791018 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731c2df-92ed-46ec-89e6-4a5acf111120-kube-api-access-nnvg9" (OuterVolumeSpecName: "kube-api-access-nnvg9") pod "6731c2df-92ed-46ec-89e6-4a5acf111120" (UID: "6731c2df-92ed-46ec-89e6-4a5acf111120"). InnerVolumeSpecName "kube-api-access-nnvg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.885837 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnvg9\" (UniqueName: \"kubernetes.io/projected/6731c2df-92ed-46ec-89e6-4a5acf111120-kube-api-access-nnvg9\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:15 crc kubenswrapper[4626]: I0223 06:58:15.885877 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6731c2df-92ed-46ec-89e6-4a5acf111120-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.094116 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nrd48" event={"ID":"6731c2df-92ed-46ec-89e6-4a5acf111120","Type":"ContainerDied","Data":"ecebdbb12c1db60f6f757cc7a36564c05ec59ea1a6581c3e6377dccb1afc537e"}
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.094197 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecebdbb12c1db60f6f757cc7a36564c05ec59ea1a6581c3e6377dccb1afc537e"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.094154 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nrd48"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.095637 4626 generic.go:334] "Generic (PLEG): container finished" podID="d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9" containerID="83a1ca0d21c9a868af64d8bf98fc7001036bc86e1482b053dbd1ac3070a989be" exitCode=0
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.095756 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4bdb-account-create-update-2s5v2" event={"ID":"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9","Type":"ContainerDied","Data":"83a1ca0d21c9a868af64d8bf98fc7001036bc86e1482b053dbd1ac3070a989be"}
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.096893 4626 generic.go:334] "Generic (PLEG): container finished" podID="986e7467-8ed5-4f55-8518-e6c539b02c17" containerID="90f39ed5761bd663b510d80526abdec8fdc67d692b43aa71a0ac30d46daa28ee" exitCode=0
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.096953 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vvf5n" event={"ID":"986e7467-8ed5-4f55-8518-e6c539b02c17","Type":"ContainerDied","Data":"90f39ed5761bd663b510d80526abdec8fdc67d692b43aa71a0ac30d46daa28ee"}
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.139773 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-687c99d675-v44bm"]
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.361036 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 23 06:58:16 crc kubenswrapper[4626]: E0223 06:58:16.361613 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6731c2df-92ed-46ec-89e6-4a5acf111120" containerName="mariadb-account-create-update"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.361633 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6731c2df-92ed-46ec-89e6-4a5acf111120" containerName="mariadb-account-create-update"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.361785 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="6731c2df-92ed-46ec-89e6-4a5acf111120" containerName="mariadb-account-create-update"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.373101 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.378390 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.378590 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.379607 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-pc59n"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.391679 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.391878 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.498653 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbbq\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-kube-api-access-fkbbq\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.498781 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5d736eeb-711a-4553-96ff-2b0d9741ac28-lock\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.498845 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.498927 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.498976 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5d736eeb-711a-4553-96ff-2b0d9741ac28-cache\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.499030 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d736eeb-711a-4553-96ff-2b0d9741ac28-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.578065 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-291c-account-create-update-6cvgc"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.600592 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.600692 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.600735 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5d736eeb-711a-4553-96ff-2b0d9741ac28-cache\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.600772 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d736eeb-711a-4553-96ff-2b0d9741ac28-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.600832 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbbq\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-kube-api-access-fkbbq\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.600892 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5d736eeb-711a-4553-96ff-2b0d9741ac28-lock\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.601295 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5d736eeb-711a-4553-96ff-2b0d9741ac28-lock\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.601296 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5d736eeb-711a-4553-96ff-2b0d9741ac28-cache\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.601901 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: E0223 06:58:16.603188 4626 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 06:58:16 crc kubenswrapper[4626]: E0223 06:58:16.603214 4626 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 06:58:16 crc kubenswrapper[4626]: E0223 06:58:16.603265 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift podName:5d736eeb-711a-4553-96ff-2b0d9741ac28 nodeName:}" failed. No retries permitted until 2026-02-23 06:58:17.10324589 +0000 UTC m=+1049.442575147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift") pod "swift-storage-0" (UID: "5d736eeb-711a-4553-96ff-2b0d9741ac28") : configmap "swift-ring-files" not found
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.612776 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d736eeb-711a-4553-96ff-2b0d9741ac28-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.621968 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbbq\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-kube-api-access-fkbbq\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.623391 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.703757 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59fa2739-f9e3-4007-9e1c-8f95cb92713e-operator-scripts\") pod \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\" (UID: \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") "
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.704648 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqg9j\" (UniqueName: \"kubernetes.io/projected/59fa2739-f9e3-4007-9e1c-8f95cb92713e-kube-api-access-mqg9j\") pod \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\" (UID: \"59fa2739-f9e3-4007-9e1c-8f95cb92713e\") "
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.706254 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59fa2739-f9e3-4007-9e1c-8f95cb92713e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59fa2739-f9e3-4007-9e1c-8f95cb92713e" (UID: "59fa2739-f9e3-4007-9e1c-8f95cb92713e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.711855 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59fa2739-f9e3-4007-9e1c-8f95cb92713e-kube-api-access-mqg9j" (OuterVolumeSpecName: "kube-api-access-mqg9j") pod "59fa2739-f9e3-4007-9e1c-8f95cb92713e" (UID: "59fa2739-f9e3-4007-9e1c-8f95cb92713e"). InnerVolumeSpecName "kube-api-access-mqg9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.801302 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5002-account-create-update-2j59l"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.809964 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqg9j\" (UniqueName: \"kubernetes.io/projected/59fa2739-f9e3-4007-9e1c-8f95cb92713e-kube-api-access-mqg9j\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.809996 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59fa2739-f9e3-4007-9e1c-8f95cb92713e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.897249 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-txmcg"
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.911678 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbf8821f-f9db-4112-80cb-a85ecbd60c66-operator-scripts\") pod \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\" (UID: \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") "
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.912044 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g74xc\" (UniqueName: \"kubernetes.io/projected/bbf8821f-f9db-4112-80cb-a85ecbd60c66-kube-api-access-g74xc\") pod \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\" (UID: \"bbf8821f-f9db-4112-80cb-a85ecbd60c66\") "
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.913454 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf8821f-f9db-4112-80cb-a85ecbd60c66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbf8821f-f9db-4112-80cb-a85ecbd60c66" (UID: "bbf8821f-f9db-4112-80cb-a85ecbd60c66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.914352 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbf8821f-f9db-4112-80cb-a85ecbd60c66-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.919964 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf8821f-f9db-4112-80cb-a85ecbd60c66-kube-api-access-g74xc" (OuterVolumeSpecName: "kube-api-access-g74xc") pod "bbf8821f-f9db-4112-80cb-a85ecbd60c66" (UID: "bbf8821f-f9db-4112-80cb-a85ecbd60c66"). InnerVolumeSpecName "kube-api-access-g74xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:58:16 crc kubenswrapper[4626]: I0223 06:58:16.920488 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rt5rb"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.015416 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnqrf\" (UniqueName: \"kubernetes.io/projected/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-kube-api-access-pnqrf\") pod \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") "
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.015560 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6ttq\" (UniqueName: \"kubernetes.io/projected/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-kube-api-access-l6ttq\") pod \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") "
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.015750 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-operator-scripts\") pod \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\" (UID: \"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6\") "
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.015789 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-operator-scripts\") pod \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\" (UID: \"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a\") "
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.016216 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a" (UID: "e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.016351 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.016371 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g74xc\" (UniqueName: \"kubernetes.io/projected/bbf8821f-f9db-4112-80cb-a85ecbd60c66-kube-api-access-g74xc\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.018328 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6" (UID: "b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.019424 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-kube-api-access-l6ttq" (OuterVolumeSpecName: "kube-api-access-l6ttq") pod "e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a" (UID: "e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a"). InnerVolumeSpecName "kube-api-access-l6ttq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.019728 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-kube-api-access-pnqrf" (OuterVolumeSpecName: "kube-api-access-pnqrf") pod "b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6" (UID: "b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6"). InnerVolumeSpecName "kube-api-access-pnqrf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.104524 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-291c-account-create-update-6cvgc" event={"ID":"59fa2739-f9e3-4007-9e1c-8f95cb92713e","Type":"ContainerDied","Data":"21548745a324f434784a1f4538e6b593cb25804b791f1a19b57bf14c03b9f880"}
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.104574 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21548745a324f434784a1f4538e6b593cb25804b791f1a19b57bf14c03b9f880"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.104675 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-291c-account-create-update-6cvgc"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.106778 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-txmcg" event={"ID":"e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a","Type":"ContainerDied","Data":"b1921a45c55d389c30642f09f66433dbe54a344bf4f1b1d26dd64e03a85a0cce"}
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.106804 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1921a45c55d389c30642f09f66433dbe54a344bf4f1b1d26dd64e03a85a0cce"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.106840 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-txmcg"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.113144 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rt5rb"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.114584 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rt5rb" event={"ID":"b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6","Type":"ContainerDied","Data":"6258af909bd4d78c4b015a352b658f38f5208956a692cdf37f1f8893298a7b14"}
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.114656 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6258af909bd4d78c4b015a352b658f38f5208956a692cdf37f1f8893298a7b14"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.115352 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5002-account-create-update-2j59l" event={"ID":"bbf8821f-f9db-4112-80cb-a85ecbd60c66","Type":"ContainerDied","Data":"16fd182cc9a9de5e274c255821df831d2cfab8de39e03cf5651627676b35bcd7"}
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.115378 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16fd182cc9a9de5e274c255821df831d2cfab8de39e03cf5651627676b35bcd7"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.115380 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5002-account-create-update-2j59l"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.117637 4626 generic.go:334] "Generic (PLEG): container finished" podID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerID="4f960e4e86825d8b344c4b5db81ff620589505fbebf30b2cb10a233cc9ef51af" exitCode=0
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.118309 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687c99d675-v44bm" event={"ID":"70d23afb-92d9-44e1-896e-c048ca8fe3d7","Type":"ContainerDied","Data":"4f960e4e86825d8b344c4b5db81ff620589505fbebf30b2cb10a233cc9ef51af"}
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.118328 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687c99d675-v44bm" event={"ID":"70d23afb-92d9-44e1-896e-c048ca8fe3d7","Type":"ContainerStarted","Data":"222ecd550da2468a5929182a00798ffa83be762887eaf59bfdfbf4ebd190583c"}
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.120288 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:17 crc kubenswrapper[4626]: E0223 06:58:17.120836 4626 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 06:58:17 crc kubenswrapper[4626]: E0223 06:58:17.120960 4626 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 06:58:17 crc kubenswrapper[4626]: E0223 06:58:17.121227 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift podName:5d736eeb-711a-4553-96ff-2b0d9741ac28 nodeName:}" failed. No retries permitted until 2026-02-23 06:58:18.120977792 +0000 UTC m=+1050.460307057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift") pod "swift-storage-0" (UID: "5d736eeb-711a-4553-96ff-2b0d9741ac28") : configmap "swift-ring-files" not found
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.125956 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnqrf\" (UniqueName: \"kubernetes.io/projected/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-kube-api-access-pnqrf\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.126004 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6ttq\" (UniqueName: \"kubernetes.io/projected/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a-kube-api-access-l6ttq\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.126019 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.517934 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vvf5n"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.564988 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4bdb-account-create-update-2s5v2"
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.638395 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986e7467-8ed5-4f55-8518-e6c539b02c17-operator-scripts\") pod \"986e7467-8ed5-4f55-8518-e6c539b02c17\" (UID: \"986e7467-8ed5-4f55-8518-e6c539b02c17\") "
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.638700 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvwvw\" (UniqueName: \"kubernetes.io/projected/986e7467-8ed5-4f55-8518-e6c539b02c17-kube-api-access-tvwvw\") pod \"986e7467-8ed5-4f55-8518-e6c539b02c17\" (UID: \"986e7467-8ed5-4f55-8518-e6c539b02c17\") "
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.639028 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/986e7467-8ed5-4f55-8518-e6c539b02c17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "986e7467-8ed5-4f55-8518-e6c539b02c17" (UID: "986e7467-8ed5-4f55-8518-e6c539b02c17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.639217 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986e7467-8ed5-4f55-8518-e6c539b02c17-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.642130 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986e7467-8ed5-4f55-8518-e6c539b02c17-kube-api-access-tvwvw" (OuterVolumeSpecName: "kube-api-access-tvwvw") pod "986e7467-8ed5-4f55-8518-e6c539b02c17" (UID: "986e7467-8ed5-4f55-8518-e6c539b02c17"). InnerVolumeSpecName "kube-api-access-tvwvw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.740586 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-operator-scripts\") pod \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.740905 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xphhp\" (UniqueName: \"kubernetes.io/projected/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-kube-api-access-xphhp\") pod \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\" (UID: \"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9\") " Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.741061 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9" (UID: "d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.741795 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.741822 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvwvw\" (UniqueName: \"kubernetes.io/projected/986e7467-8ed5-4f55-8518-e6c539b02c17-kube-api-access-tvwvw\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.745095 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-kube-api-access-xphhp" (OuterVolumeSpecName: "kube-api-access-xphhp") pod "d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9" (UID: "d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9"). InnerVolumeSpecName "kube-api-access-xphhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:17 crc kubenswrapper[4626]: I0223 06:58:17.843270 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xphhp\" (UniqueName: \"kubernetes.io/projected/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9-kube-api-access-xphhp\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.087068 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9lq4g"] Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.087946 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986e7467-8ed5-4f55-8518-e6c539b02c17" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.087966 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="986e7467-8ed5-4f55-8518-e6c539b02c17" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.087980 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.087986 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.088001 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088006 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.088025 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: 
I0223 06:58:18.088030 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.088044 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59fa2739-f9e3-4007-9e1c-8f95cb92713e" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088051 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="59fa2739-f9e3-4007-9e1c-8f95cb92713e" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.088063 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf8821f-f9db-4112-80cb-a85ecbd60c66" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088068 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf8821f-f9db-4112-80cb-a85ecbd60c66" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088255 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088265 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="986e7467-8ed5-4f55-8518-e6c539b02c17" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088273 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088281 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf8821f-f9db-4112-80cb-a85ecbd60c66" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088289 4626 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="59fa2739-f9e3-4007-9e1c-8f95cb92713e" containerName="mariadb-account-create-update" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.088299 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a" containerName="mariadb-database-create" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.089207 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.093080 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.093356 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zv7dz" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.114686 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9lq4g"] Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.135317 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687c99d675-v44bm" event={"ID":"70d23afb-92d9-44e1-896e-c048ca8fe3d7","Type":"ContainerStarted","Data":"df891bf3b7a98e443ae0db320f1a88d8910f36d0a315cf9b4ff41dc3dde7ab7c"} Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.135654 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-687c99d675-v44bm" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.139447 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4bdb-account-create-update-2s5v2" event={"ID":"d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9","Type":"ContainerDied","Data":"31fe7d6078f1c96698e3ba8922351285d5e3a4c58ce0301f6255807699982758"} Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.139487 4626 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="31fe7d6078f1c96698e3ba8922351285d5e3a4c58ce0301f6255807699982758" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.139466 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4bdb-account-create-update-2s5v2" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.142071 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vvf5n" event={"ID":"986e7467-8ed5-4f55-8518-e6c539b02c17","Type":"ContainerDied","Data":"552f1f7797599a57213536432ee1ea0ba76dd0cbcaa0450afe85435cd47be793"} Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.142122 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552f1f7797599a57213536432ee1ea0ba76dd0cbcaa0450afe85435cd47be793" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.142182 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vvf5n" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.147871 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0" Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.149488 4626 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.149747 4626 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 06:58:18 crc kubenswrapper[4626]: E0223 06:58:18.149827 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift podName:5d736eeb-711a-4553-96ff-2b0d9741ac28 nodeName:}" 
failed. No retries permitted until 2026-02-23 06:58:20.149801213 +0000 UTC m=+1052.489130478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift") pod "swift-storage-0" (UID: "5d736eeb-711a-4553-96ff-2b0d9741ac28") : configmap "swift-ring-files" not found Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.162765 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-687c99d675-v44bm" podStartSLOduration=3.16275185 podStartE2EDuration="3.16275185s" podCreationTimestamp="2026-02-23 06:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:18.157940298 +0000 UTC m=+1050.497269565" watchObservedRunningTime="2026-02-23 06:58:18.16275185 +0000 UTC m=+1050.502081117" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.249713 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qsz\" (UniqueName: \"kubernetes.io/projected/421f011a-6b43-4b9d-9fa7-c293dc581234-kube-api-access-m6qsz\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.249861 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-config-data\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.249912 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-combined-ca-bundle\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.249949 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-db-sync-config-data\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.352347 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qsz\" (UniqueName: \"kubernetes.io/projected/421f011a-6b43-4b9d-9fa7-c293dc581234-kube-api-access-m6qsz\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.352618 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-config-data\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.352689 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-combined-ca-bundle\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.352750 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-db-sync-config-data\") pod 
\"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.359373 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-db-sync-config-data\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.362720 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-combined-ca-bundle\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.363584 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-config-data\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.365910 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qsz\" (UniqueName: \"kubernetes.io/projected/421f011a-6b43-4b9d-9fa7-c293dc581234-kube-api-access-m6qsz\") pod \"glance-db-sync-9lq4g\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:18 crc kubenswrapper[4626]: I0223 06:58:18.407639 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:19 crc kubenswrapper[4626]: I0223 06:58:19.175868 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9lq4g"] Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.160011 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9lq4g" event={"ID":"421f011a-6b43-4b9d-9fa7-c293dc581234","Type":"ContainerStarted","Data":"ab95499c2c33070818f15182d30ef04bf67d49ceb49da5e98dafd91229b7231c"} Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.205159 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0" Feb 23 06:58:20 crc kubenswrapper[4626]: E0223 06:58:20.205574 4626 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 06:58:20 crc kubenswrapper[4626]: E0223 06:58:20.205591 4626 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 06:58:20 crc kubenswrapper[4626]: E0223 06:58:20.205705 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift podName:5d736eeb-711a-4553-96ff-2b0d9741ac28 nodeName:}" failed. No retries permitted until 2026-02-23 06:58:24.205668851 +0000 UTC m=+1056.544998107 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift") pod "swift-storage-0" (UID: "5d736eeb-711a-4553-96ff-2b0d9741ac28") : configmap "swift-ring-files" not found Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.346051 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dkrrn"] Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.347371 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.350437 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.351680 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.351813 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.366655 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dkrrn"] Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.412914 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-ring-data-devices\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.412957 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-scripts\") pod \"swift-ring-rebalance-dkrrn\" (UID: 
\"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.413029 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-combined-ca-bundle\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.413078 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-swiftconf\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.413122 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f4f8444-d09b-4213-be5c-585c699d29ae-etc-swift\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.413145 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5pc\" (UniqueName: \"kubernetes.io/projected/0f4f8444-d09b-4213-be5c-585c699d29ae-kube-api-access-gs5pc\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.413175 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-dispersionconf\") pod 
\"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.516206 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-ring-data-devices\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.516256 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-scripts\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.516385 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-combined-ca-bundle\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.516457 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-swiftconf\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.516538 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f4f8444-d09b-4213-be5c-585c699d29ae-etc-swift\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " 
pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.516574 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5pc\" (UniqueName: \"kubernetes.io/projected/0f4f8444-d09b-4213-be5c-585c699d29ae-kube-api-access-gs5pc\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.516616 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-dispersionconf\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.518164 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-ring-data-devices\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.519528 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-scripts\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.520229 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f4f8444-d09b-4213-be5c-585c699d29ae-etc-swift\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.528441 
4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-combined-ca-bundle\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.528908 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-dispersionconf\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.538560 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5pc\" (UniqueName: \"kubernetes.io/projected/0f4f8444-d09b-4213-be5c-585c699d29ae-kube-api-access-gs5pc\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.544812 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-swiftconf\") pod \"swift-ring-rebalance-dkrrn\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.663729 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.944126 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nrd48"] Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.948114 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nrd48"] Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.980709 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6k9g5"] Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.982407 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.985939 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 23 06:58:20 crc kubenswrapper[4626]: I0223 06:58:20.999437 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6k9g5"] Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.026576 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27lwc\" (UniqueName: \"kubernetes.io/projected/30ff9fef-0642-4436-ac2d-d9a8b26f950e-kube-api-access-27lwc\") pod \"root-account-create-update-6k9g5\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.027019 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ff9fef-0642-4436-ac2d-d9a8b26f950e-operator-scripts\") pod \"root-account-create-update-6k9g5\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:21 crc kubenswrapper[4626]: 
I0223 06:58:21.111115 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dkrrn"] Feb 23 06:58:21 crc kubenswrapper[4626]: W0223 06:58:21.111997 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f4f8444_d09b_4213_be5c_585c699d29ae.slice/crio-4d0b78fe795f266f6c90bf0d31d7a0375cffe6e15d99b6fa655ec016446c630f WatchSource:0}: Error finding container 4d0b78fe795f266f6c90bf0d31d7a0375cffe6e15d99b6fa655ec016446c630f: Status 404 returned error can't find the container with id 4d0b78fe795f266f6c90bf0d31d7a0375cffe6e15d99b6fa655ec016446c630f Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.128726 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ff9fef-0642-4436-ac2d-d9a8b26f950e-operator-scripts\") pod \"root-account-create-update-6k9g5\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.128874 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27lwc\" (UniqueName: \"kubernetes.io/projected/30ff9fef-0642-4436-ac2d-d9a8b26f950e-kube-api-access-27lwc\") pod \"root-account-create-update-6k9g5\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.130844 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ff9fef-0642-4436-ac2d-d9a8b26f950e-operator-scripts\") pod \"root-account-create-update-6k9g5\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.150241 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-27lwc\" (UniqueName: \"kubernetes.io/projected/30ff9fef-0642-4436-ac2d-d9a8b26f950e-kube-api-access-27lwc\") pod \"root-account-create-update-6k9g5\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.173264 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dkrrn" event={"ID":"0f4f8444-d09b-4213-be5c-585c699d29ae","Type":"ContainerStarted","Data":"4d0b78fe795f266f6c90bf0d31d7a0375cffe6e15d99b6fa655ec016446c630f"} Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.303638 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.709262 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6k9g5"] Feb 23 06:58:21 crc kubenswrapper[4626]: I0223 06:58:21.998081 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731c2df-92ed-46ec-89e6-4a5acf111120" path="/var/lib/kubelet/pods/6731c2df-92ed-46ec-89e6-4a5acf111120/volumes" Feb 23 06:58:22 crc kubenswrapper[4626]: I0223 06:58:22.184606 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6k9g5" event={"ID":"30ff9fef-0642-4436-ac2d-d9a8b26f950e","Type":"ContainerStarted","Data":"98c075d0d528adb9f1f6bc5433206ddd380210f877152227af5922736a573e3b"} Feb 23 06:58:22 crc kubenswrapper[4626]: I0223 06:58:22.184658 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6k9g5" event={"ID":"30ff9fef-0642-4436-ac2d-d9a8b26f950e","Type":"ContainerStarted","Data":"fc174ebd61e4cf7ebf6b656416a4faf5bbe5ed7eb7f9205ac142d08a41e38370"} Feb 23 06:58:22 crc kubenswrapper[4626]: I0223 06:58:22.738644 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" 
Feb 23 06:58:22 crc kubenswrapper[4626]: I0223 06:58:22.775028 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-6k9g5" podStartSLOduration=2.775005363 podStartE2EDuration="2.775005363s" podCreationTimestamp="2026-02-23 06:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:22.207627756 +0000 UTC m=+1054.546957022" watchObservedRunningTime="2026-02-23 06:58:22.775005363 +0000 UTC m=+1055.114334628" Feb 23 06:58:23 crc kubenswrapper[4626]: I0223 06:58:23.197928 4626 generic.go:334] "Generic (PLEG): container finished" podID="30ff9fef-0642-4436-ac2d-d9a8b26f950e" containerID="98c075d0d528adb9f1f6bc5433206ddd380210f877152227af5922736a573e3b" exitCode=0 Feb 23 06:58:23 crc kubenswrapper[4626]: I0223 06:58:23.197977 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6k9g5" event={"ID":"30ff9fef-0642-4436-ac2d-d9a8b26f950e","Type":"ContainerDied","Data":"98c075d0d528adb9f1f6bc5433206ddd380210f877152227af5922736a573e3b"} Feb 23 06:58:24 crc kubenswrapper[4626]: I0223 06:58:24.307033 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0" Feb 23 06:58:24 crc kubenswrapper[4626]: E0223 06:58:24.307299 4626 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 06:58:24 crc kubenswrapper[4626]: E0223 06:58:24.307435 4626 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 06:58:24 crc kubenswrapper[4626]: E0223 06:58:24.307540 4626 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift podName:5d736eeb-711a-4553-96ff-2b0d9741ac28 nodeName:}" failed. No retries permitted until 2026-02-23 06:58:32.307511966 +0000 UTC m=+1064.646841223 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift") pod "swift-storage-0" (UID: "5d736eeb-711a-4553-96ff-2b0d9741ac28") : configmap "swift-ring-files" not found Feb 23 06:58:25 crc kubenswrapper[4626]: I0223 06:58:25.591679 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-687c99d675-v44bm" Feb 23 06:58:25 crc kubenswrapper[4626]: I0223 06:58:25.682844 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8474bb99-4mn6h"] Feb 23 06:58:25 crc kubenswrapper[4626]: I0223 06:58:25.683154 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerName="dnsmasq-dns" containerID="cri-o://dd8cd9f28b39de6e626556ddbf425d982bddda8ec482402092bd612134adb3a7" gracePeriod=10 Feb 23 06:58:26 crc kubenswrapper[4626]: I0223 06:58:26.230375 4626 generic.go:334] "Generic (PLEG): container finished" podID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerID="dd8cd9f28b39de6e626556ddbf425d982bddda8ec482402092bd612134adb3a7" exitCode=0 Feb 23 06:58:26 crc kubenswrapper[4626]: I0223 06:58:26.230454 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" event={"ID":"64931128-a9a9-43f5-a7ad-6a9d4ba7b642","Type":"ContainerDied","Data":"dd8cd9f28b39de6e626556ddbf425d982bddda8ec482402092bd612134adb3a7"} Feb 23 06:58:26 crc kubenswrapper[4626]: I0223 06:58:26.969292 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: connect: connection refused" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.198955 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.259033 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6k9g5" event={"ID":"30ff9fef-0642-4436-ac2d-d9a8b26f950e","Type":"ContainerDied","Data":"fc174ebd61e4cf7ebf6b656416a4faf5bbe5ed7eb7f9205ac142d08a41e38370"} Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.259237 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc174ebd61e4cf7ebf6b656416a4faf5bbe5ed7eb7f9205ac142d08a41e38370" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.259377 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6k9g5" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.381982 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ff9fef-0642-4436-ac2d-d9a8b26f950e-operator-scripts\") pod \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.382077 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27lwc\" (UniqueName: \"kubernetes.io/projected/30ff9fef-0642-4436-ac2d-d9a8b26f950e-kube-api-access-27lwc\") pod \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\" (UID: \"30ff9fef-0642-4436-ac2d-d9a8b26f950e\") " Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.403283 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ff9fef-0642-4436-ac2d-d9a8b26f950e-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "30ff9fef-0642-4436-ac2d-d9a8b26f950e" (UID: "30ff9fef-0642-4436-ac2d-d9a8b26f950e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.407115 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ff9fef-0642-4436-ac2d-d9a8b26f950e-kube-api-access-27lwc" (OuterVolumeSpecName: "kube-api-access-27lwc") pod "30ff9fef-0642-4436-ac2d-d9a8b26f950e" (UID: "30ff9fef-0642-4436-ac2d-d9a8b26f950e"). InnerVolumeSpecName "kube-api-access-27lwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.485208 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ff9fef-0642-4436-ac2d-d9a8b26f950e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.485428 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27lwc\" (UniqueName: \"kubernetes.io/projected/30ff9fef-0642-4436-ac2d-d9a8b26f950e-kube-api-access-27lwc\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.527625 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.691140 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-config\") pod \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.691228 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-sb\") pod \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.691339 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-dns-svc\") pod \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.691420 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-nb\") pod \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.691687 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njfwm\" (UniqueName: \"kubernetes.io/projected/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-kube-api-access-njfwm\") pod \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\" (UID: \"64931128-a9a9-43f5-a7ad-6a9d4ba7b642\") " Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.699731 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-kube-api-access-njfwm" (OuterVolumeSpecName: "kube-api-access-njfwm") pod "64931128-a9a9-43f5-a7ad-6a9d4ba7b642" (UID: "64931128-a9a9-43f5-a7ad-6a9d4ba7b642"). InnerVolumeSpecName "kube-api-access-njfwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.725184 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-config" (OuterVolumeSpecName: "config") pod "64931128-a9a9-43f5-a7ad-6a9d4ba7b642" (UID: "64931128-a9a9-43f5-a7ad-6a9d4ba7b642"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.732108 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64931128-a9a9-43f5-a7ad-6a9d4ba7b642" (UID: "64931128-a9a9-43f5-a7ad-6a9d4ba7b642"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.733201 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64931128-a9a9-43f5-a7ad-6a9d4ba7b642" (UID: "64931128-a9a9-43f5-a7ad-6a9d4ba7b642"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.743274 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64931128-a9a9-43f5-a7ad-6a9d4ba7b642" (UID: "64931128-a9a9-43f5-a7ad-6a9d4ba7b642"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.795931 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.795981 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njfwm\" (UniqueName: \"kubernetes.io/projected/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-kube-api-access-njfwm\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.795996 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.796006 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:27 crc kubenswrapper[4626]: I0223 06:58:27.796017 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64931128-a9a9-43f5-a7ad-6a9d4ba7b642-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.121272 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9j9vm" podUID="61db3f96-4a68-44bd-82ff-076ba32d9066" containerName="ovn-controller" probeResult="failure" output=< Feb 23 06:58:28 crc kubenswrapper[4626]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 23 06:58:28 crc kubenswrapper[4626]: > Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.276452 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" 
event={"ID":"64931128-a9a9-43f5-a7ad-6a9d4ba7b642","Type":"ContainerDied","Data":"14287deeda76325826023952347173d4cef16fdf3e87c4f3b6f60d6b89370a91"} Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.276523 4626 scope.go:117] "RemoveContainer" containerID="dd8cd9f28b39de6e626556ddbf425d982bddda8ec482402092bd612134adb3a7" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.276673 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8474bb99-4mn6h" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.281683 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dkrrn" event={"ID":"0f4f8444-d09b-4213-be5c-585c699d29ae","Type":"ContainerStarted","Data":"b62ac0da43cea9c2eda0d50941e5d29141e52801f625b7abfa247f866078c574"} Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.300988 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8474bb99-4mn6h"] Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.309279 4626 scope.go:117] "RemoveContainer" containerID="c441b6a0f9ddcf72814a40da14be91ba4883f0cb25ab8d6595e9ae3f23b73f6c" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.311381 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8474bb99-4mn6h"] Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.312982 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.349683 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dkrrn" podStartSLOduration=2.253616015 podStartE2EDuration="8.349671028s" podCreationTimestamp="2026-02-23 06:58:20 +0000 UTC" firstStartedPulling="2026-02-23 06:58:21.116597876 +0000 UTC m=+1053.455927131" lastFinishedPulling="2026-02-23 06:58:27.212652879 +0000 UTC m=+1059.551982144" 
observedRunningTime="2026-02-23 06:58:28.340818116 +0000 UTC m=+1060.680147382" watchObservedRunningTime="2026-02-23 06:58:28.349671028 +0000 UTC m=+1060.689000294" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.350168 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-574ch" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.599042 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9j9vm-config-mjbbk"] Feb 23 06:58:28 crc kubenswrapper[4626]: E0223 06:58:28.599482 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerName="dnsmasq-dns" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.599521 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerName="dnsmasq-dns" Feb 23 06:58:28 crc kubenswrapper[4626]: E0223 06:58:28.599532 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ff9fef-0642-4436-ac2d-d9a8b26f950e" containerName="mariadb-account-create-update" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.599541 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ff9fef-0642-4436-ac2d-d9a8b26f950e" containerName="mariadb-account-create-update" Feb 23 06:58:28 crc kubenswrapper[4626]: E0223 06:58:28.599586 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerName="init" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.599593 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerName="init" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.599794 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ff9fef-0642-4436-ac2d-d9a8b26f950e" containerName="mariadb-account-create-update" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.599824 4626 
memory_manager.go:354] "RemoveStaleState removing state" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" containerName="dnsmasq-dns" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.600407 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.605075 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.617719 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-additional-scripts\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.617886 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.617994 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-scripts\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.618365 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-log-ovn\") 
pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.618607 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run-ovn\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.618949 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcmm\" (UniqueName: \"kubernetes.io/projected/190b58b0-5d43-4753-a6e8-05281c4fb7fb-kube-api-access-kpcmm\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.659582 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9j9vm-config-mjbbk"] Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724265 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-additional-scripts\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724334 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 
06:58:28.724365 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-scripts\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724532 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-log-ovn\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724573 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run-ovn\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724606 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpcmm\" (UniqueName: \"kubernetes.io/projected/190b58b0-5d43-4753-a6e8-05281c4fb7fb-kube-api-access-kpcmm\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724768 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724863 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-log-ovn\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.724977 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run-ovn\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.725003 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-additional-scripts\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.727971 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-scripts\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.778119 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcmm\" (UniqueName: \"kubernetes.io/projected/190b58b0-5d43-4753-a6e8-05281c4fb7fb-kube-api-access-kpcmm\") pod \"ovn-controller-9j9vm-config-mjbbk\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:28 crc kubenswrapper[4626]: I0223 06:58:28.918980 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:29 crc kubenswrapper[4626]: I0223 06:58:29.449825 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9j9vm-config-mjbbk"] Feb 23 06:58:29 crc kubenswrapper[4626]: I0223 06:58:29.995630 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64931128-a9a9-43f5-a7ad-6a9d4ba7b642" path="/var/lib/kubelet/pods/64931128-a9a9-43f5-a7ad-6a9d4ba7b642/volumes" Feb 23 06:58:32 crc kubenswrapper[4626]: I0223 06:58:32.311623 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0" Feb 23 06:58:32 crc kubenswrapper[4626]: E0223 06:58:32.311997 4626 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 06:58:32 crc kubenswrapper[4626]: E0223 06:58:32.312017 4626 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 06:58:32 crc kubenswrapper[4626]: E0223 06:58:32.312072 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift podName:5d736eeb-711a-4553-96ff-2b0d9741ac28 nodeName:}" failed. No retries permitted until 2026-02-23 06:58:48.312055467 +0000 UTC m=+1080.651384733 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift") pod "swift-storage-0" (UID: "5d736eeb-711a-4553-96ff-2b0d9741ac28") : configmap "swift-ring-files" not found Feb 23 06:58:33 crc kubenswrapper[4626]: I0223 06:58:33.117656 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9j9vm" podUID="61db3f96-4a68-44bd-82ff-076ba32d9066" containerName="ovn-controller" probeResult="failure" output=< Feb 23 06:58:33 crc kubenswrapper[4626]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 23 06:58:33 crc kubenswrapper[4626]: > Feb 23 06:58:34 crc kubenswrapper[4626]: I0223 06:58:34.360720 4626 generic.go:334] "Generic (PLEG): container finished" podID="0f4f8444-d09b-4213-be5c-585c699d29ae" containerID="b62ac0da43cea9c2eda0d50941e5d29141e52801f625b7abfa247f866078c574" exitCode=0 Feb 23 06:58:34 crc kubenswrapper[4626]: I0223 06:58:34.360829 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dkrrn" event={"ID":"0f4f8444-d09b-4213-be5c-585c699d29ae","Type":"ContainerDied","Data":"b62ac0da43cea9c2eda0d50941e5d29141e52801f625b7abfa247f866078c574"} Feb 23 06:58:35 crc kubenswrapper[4626]: I0223 06:58:35.370823 4626 generic.go:334] "Generic (PLEG): container finished" podID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerID="51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03" exitCode=0 Feb 23 06:58:35 crc kubenswrapper[4626]: I0223 06:58:35.370938 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1","Type":"ContainerDied","Data":"51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03"} Feb 23 06:58:35 crc kubenswrapper[4626]: I0223 06:58:35.376346 4626 generic.go:334] "Generic (PLEG): container finished" podID="cb16cec1-24fc-4504-8968-0c3fb8368f27" 
containerID="06e385e38283193cdbfcddd982ff856e4477c5930a30d9606b70b08d9fcacc08" exitCode=0 Feb 23 06:58:35 crc kubenswrapper[4626]: I0223 06:58:35.377174 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb16cec1-24fc-4504-8968-0c3fb8368f27","Type":"ContainerDied","Data":"06e385e38283193cdbfcddd982ff856e4477c5930a30d9606b70b08d9fcacc08"} Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.389001 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9j9vm-config-mjbbk" event={"ID":"190b58b0-5d43-4753-a6e8-05281c4fb7fb","Type":"ContainerStarted","Data":"e3dd2f5065afc751e87ad1fdb029ad09a7160a31966b6435d8666b00cd8afb18"} Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.391405 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dkrrn" event={"ID":"0f4f8444-d09b-4213-be5c-585c699d29ae","Type":"ContainerDied","Data":"4d0b78fe795f266f6c90bf0d31d7a0375cffe6e15d99b6fa655ec016446c630f"} Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.391442 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0b78fe795f266f6c90bf0d31d7a0375cffe6e15d99b6fa655ec016446c630f" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.398593 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.490086 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-swiftconf\") pod \"0f4f8444-d09b-4213-be5c-585c699d29ae\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.490219 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-scripts\") pod \"0f4f8444-d09b-4213-be5c-585c699d29ae\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.490247 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f4f8444-d09b-4213-be5c-585c699d29ae-etc-swift\") pod \"0f4f8444-d09b-4213-be5c-585c699d29ae\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.490324 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-ring-data-devices\") pod \"0f4f8444-d09b-4213-be5c-585c699d29ae\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.490369 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs5pc\" (UniqueName: \"kubernetes.io/projected/0f4f8444-d09b-4213-be5c-585c699d29ae-kube-api-access-gs5pc\") pod \"0f4f8444-d09b-4213-be5c-585c699d29ae\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.490402 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-dispersionconf\") pod \"0f4f8444-d09b-4213-be5c-585c699d29ae\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.490445 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-combined-ca-bundle\") pod \"0f4f8444-d09b-4213-be5c-585c699d29ae\" (UID: \"0f4f8444-d09b-4213-be5c-585c699d29ae\") " Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.491375 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0f4f8444-d09b-4213-be5c-585c699d29ae" (UID: "0f4f8444-d09b-4213-be5c-585c699d29ae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.491995 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f4f8444-d09b-4213-be5c-585c699d29ae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0f4f8444-d09b-4213-be5c-585c699d29ae" (UID: "0f4f8444-d09b-4213-be5c-585c699d29ae"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.495735 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f4f8444-d09b-4213-be5c-585c699d29ae-kube-api-access-gs5pc" (OuterVolumeSpecName: "kube-api-access-gs5pc") pod "0f4f8444-d09b-4213-be5c-585c699d29ae" (UID: "0f4f8444-d09b-4213-be5c-585c699d29ae"). InnerVolumeSpecName "kube-api-access-gs5pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.500450 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0f4f8444-d09b-4213-be5c-585c699d29ae" (UID: "0f4f8444-d09b-4213-be5c-585c699d29ae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.505131 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-scripts" (OuterVolumeSpecName: "scripts") pod "0f4f8444-d09b-4213-be5c-585c699d29ae" (UID: "0f4f8444-d09b-4213-be5c-585c699d29ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.514533 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f4f8444-d09b-4213-be5c-585c699d29ae" (UID: "0f4f8444-d09b-4213-be5c-585c699d29ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.519465 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0f4f8444-d09b-4213-be5c-585c699d29ae" (UID: "0f4f8444-d09b-4213-be5c-585c699d29ae"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.593416 4626 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.593798 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.593831 4626 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f4f8444-d09b-4213-be5c-585c699d29ae-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.593843 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.593854 4626 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f4f8444-d09b-4213-be5c-585c699d29ae-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.593864 4626 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f4f8444-d09b-4213-be5c-585c699d29ae-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:36 crc kubenswrapper[4626]: I0223 06:58:36.593874 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs5pc\" (UniqueName: \"kubernetes.io/projected/0f4f8444-d09b-4213-be5c-585c699d29ae-kube-api-access-gs5pc\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.400682 4626 generic.go:334] 
"Generic (PLEG): container finished" podID="190b58b0-5d43-4753-a6e8-05281c4fb7fb" containerID="217bf194e4e6f031523cd713c7345593b7735ac7dafeaf1a492d3bf1998076dd" exitCode=0 Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.401086 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9j9vm-config-mjbbk" event={"ID":"190b58b0-5d43-4753-a6e8-05281c4fb7fb","Type":"ContainerDied","Data":"217bf194e4e6f031523cd713c7345593b7735ac7dafeaf1a492d3bf1998076dd"} Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.403151 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1","Type":"ContainerStarted","Data":"66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039"} Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.404040 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.405305 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9lq4g" event={"ID":"421f011a-6b43-4b9d-9fa7-c293dc581234","Type":"ContainerStarted","Data":"bc0424fd284b479b719c4557b4b1273418ef13b422ddf1af6cc32fe49ff9e0c5"} Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.410934 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dkrrn" Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.415324 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb16cec1-24fc-4504-8968-0c3fb8368f27","Type":"ContainerStarted","Data":"1e469c0ada8f0d6a21352ad236a3ffa6bfae564238ce58c5b5526e59a3c383d8"} Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.415933 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.467465 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9lq4g" podStartSLOduration=2.334810484 podStartE2EDuration="19.467446731s" podCreationTimestamp="2026-02-23 06:58:18 +0000 UTC" firstStartedPulling="2026-02-23 06:58:19.173075381 +0000 UTC m=+1051.512404647" lastFinishedPulling="2026-02-23 06:58:36.305711628 +0000 UTC m=+1068.645040894" observedRunningTime="2026-02-23 06:58:37.465894104 +0000 UTC m=+1069.805223370" watchObservedRunningTime="2026-02-23 06:58:37.467446731 +0000 UTC m=+1069.806775998" Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.495056 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.976101408 podStartE2EDuration="1m19.495042513s" podCreationTimestamp="2026-02-23 06:57:18 +0000 UTC" firstStartedPulling="2026-02-23 06:57:20.127463071 +0000 UTC m=+992.466792336" lastFinishedPulling="2026-02-23 06:58:02.646404176 +0000 UTC m=+1034.985733441" observedRunningTime="2026-02-23 06:58:37.487894806 +0000 UTC m=+1069.827224072" watchObservedRunningTime="2026-02-23 06:58:37.495042513 +0000 UTC m=+1069.834371779" Feb 23 06:58:37 crc kubenswrapper[4626]: I0223 06:58:37.512436 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371957.342358 
podStartE2EDuration="1m19.512417573s" podCreationTimestamp="2026-02-23 06:57:18 +0000 UTC" firstStartedPulling="2026-02-23 06:57:20.306708052 +0000 UTC m=+992.646037318" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:37.505814112 +0000 UTC m=+1069.845143368" watchObservedRunningTime="2026-02-23 06:58:37.512417573 +0000 UTC m=+1069.851746839" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.128478 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9j9vm" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.745516 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.837943 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-scripts\") pod \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.838187 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run\") pod \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.838338 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run-ovn\") pod \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.838359 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-log-ovn\") pod \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.838418 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-additional-scripts\") pod \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.838455 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpcmm\" (UniqueName: \"kubernetes.io/projected/190b58b0-5d43-4753-a6e8-05281c4fb7fb-kube-api-access-kpcmm\") pod \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\" (UID: \"190b58b0-5d43-4753-a6e8-05281c4fb7fb\") " Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.841620 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "190b58b0-5d43-4753-a6e8-05281c4fb7fb" (UID: "190b58b0-5d43-4753-a6e8-05281c4fb7fb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.842655 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run" (OuterVolumeSpecName: "var-run") pod "190b58b0-5d43-4753-a6e8-05281c4fb7fb" (UID: "190b58b0-5d43-4753-a6e8-05281c4fb7fb"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.842657 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "190b58b0-5d43-4753-a6e8-05281c4fb7fb" (UID: "190b58b0-5d43-4753-a6e8-05281c4fb7fb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.842872 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-scripts" (OuterVolumeSpecName: "scripts") pod "190b58b0-5d43-4753-a6e8-05281c4fb7fb" (UID: "190b58b0-5d43-4753-a6e8-05281c4fb7fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.843205 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "190b58b0-5d43-4753-a6e8-05281c4fb7fb" (UID: "190b58b0-5d43-4753-a6e8-05281c4fb7fb"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.859848 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190b58b0-5d43-4753-a6e8-05281c4fb7fb-kube-api-access-kpcmm" (OuterVolumeSpecName: "kube-api-access-kpcmm") pod "190b58b0-5d43-4753-a6e8-05281c4fb7fb" (UID: "190b58b0-5d43-4753-a6e8-05281c4fb7fb"). InnerVolumeSpecName "kube-api-access-kpcmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.941614 4626 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.941647 4626 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.941665 4626 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.941679 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpcmm\" (UniqueName: \"kubernetes.io/projected/190b58b0-5d43-4753-a6e8-05281c4fb7fb-kube-api-access-kpcmm\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.941689 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/190b58b0-5d43-4753-a6e8-05281c4fb7fb-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:38 crc kubenswrapper[4626]: I0223 06:58:38.941697 4626 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/190b58b0-5d43-4753-a6e8-05281c4fb7fb-var-run\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:39 crc kubenswrapper[4626]: I0223 06:58:39.427592 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9j9vm-config-mjbbk" event={"ID":"190b58b0-5d43-4753-a6e8-05281c4fb7fb","Type":"ContainerDied","Data":"e3dd2f5065afc751e87ad1fdb029ad09a7160a31966b6435d8666b00cd8afb18"} Feb 23 06:58:39 crc 
kubenswrapper[4626]: I0223 06:58:39.427638 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3dd2f5065afc751e87ad1fdb029ad09a7160a31966b6435d8666b00cd8afb18" Feb 23 06:58:39 crc kubenswrapper[4626]: I0223 06:58:39.427710 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9j9vm-config-mjbbk" Feb 23 06:58:39 crc kubenswrapper[4626]: I0223 06:58:39.860141 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9j9vm-config-mjbbk"] Feb 23 06:58:39 crc kubenswrapper[4626]: I0223 06:58:39.865244 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9j9vm-config-mjbbk"] Feb 23 06:58:39 crc kubenswrapper[4626]: I0223 06:58:39.991469 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190b58b0-5d43-4753-a6e8-05281c4fb7fb" path="/var/lib/kubelet/pods/190b58b0-5d43-4753-a6e8-05281c4fb7fb/volumes" Feb 23 06:58:41 crc kubenswrapper[4626]: I0223 06:58:41.441363 4626 generic.go:334] "Generic (PLEG): container finished" podID="421f011a-6b43-4b9d-9fa7-c293dc581234" containerID="bc0424fd284b479b719c4557b4b1273418ef13b422ddf1af6cc32fe49ff9e0c5" exitCode=0 Feb 23 06:58:41 crc kubenswrapper[4626]: I0223 06:58:41.441442 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9lq4g" event={"ID":"421f011a-6b43-4b9d-9fa7-c293dc581234","Type":"ContainerDied","Data":"bc0424fd284b479b719c4557b4b1273418ef13b422ddf1af6cc32fe49ff9e0c5"} Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.803612 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.902724 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-combined-ca-bundle\") pod \"421f011a-6b43-4b9d-9fa7-c293dc581234\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.903100 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-db-sync-config-data\") pod \"421f011a-6b43-4b9d-9fa7-c293dc581234\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.903139 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-config-data\") pod \"421f011a-6b43-4b9d-9fa7-c293dc581234\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.903189 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6qsz\" (UniqueName: \"kubernetes.io/projected/421f011a-6b43-4b9d-9fa7-c293dc581234-kube-api-access-m6qsz\") pod \"421f011a-6b43-4b9d-9fa7-c293dc581234\" (UID: \"421f011a-6b43-4b9d-9fa7-c293dc581234\") " Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.916765 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421f011a-6b43-4b9d-9fa7-c293dc581234-kube-api-access-m6qsz" (OuterVolumeSpecName: "kube-api-access-m6qsz") pod "421f011a-6b43-4b9d-9fa7-c293dc581234" (UID: "421f011a-6b43-4b9d-9fa7-c293dc581234"). InnerVolumeSpecName "kube-api-access-m6qsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.918727 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "421f011a-6b43-4b9d-9fa7-c293dc581234" (UID: "421f011a-6b43-4b9d-9fa7-c293dc581234"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.942855 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "421f011a-6b43-4b9d-9fa7-c293dc581234" (UID: "421f011a-6b43-4b9d-9fa7-c293dc581234"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:58:42 crc kubenswrapper[4626]: I0223 06:58:42.950676 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-config-data" (OuterVolumeSpecName: "config-data") pod "421f011a-6b43-4b9d-9fa7-c293dc581234" (UID: "421f011a-6b43-4b9d-9fa7-c293dc581234"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.007315 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.007344 4626 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.007355 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421f011a-6b43-4b9d-9fa7-c293dc581234-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.007366 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6qsz\" (UniqueName: \"kubernetes.io/projected/421f011a-6b43-4b9d-9fa7-c293dc581234-kube-api-access-m6qsz\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.456805 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9lq4g" event={"ID":"421f011a-6b43-4b9d-9fa7-c293dc581234","Type":"ContainerDied","Data":"ab95499c2c33070818f15182d30ef04bf67d49ceb49da5e98dafd91229b7231c"} Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.456849 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab95499c2c33070818f15182d30ef04bf67d49ceb49da5e98dafd91229b7231c" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.456873 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9lq4g" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.852176 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-99b585b75-58xkk"] Feb 23 06:58:43 crc kubenswrapper[4626]: E0223 06:58:43.852569 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190b58b0-5d43-4753-a6e8-05281c4fb7fb" containerName="ovn-config" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.852586 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="190b58b0-5d43-4753-a6e8-05281c4fb7fb" containerName="ovn-config" Feb 23 06:58:43 crc kubenswrapper[4626]: E0223 06:58:43.852598 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421f011a-6b43-4b9d-9fa7-c293dc581234" containerName="glance-db-sync" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.852605 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="421f011a-6b43-4b9d-9fa7-c293dc581234" containerName="glance-db-sync" Feb 23 06:58:43 crc kubenswrapper[4626]: E0223 06:58:43.852617 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4f8444-d09b-4213-be5c-585c699d29ae" containerName="swift-ring-rebalance" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.852625 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4f8444-d09b-4213-be5c-585c699d29ae" containerName="swift-ring-rebalance" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.852804 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="190b58b0-5d43-4753-a6e8-05281c4fb7fb" containerName="ovn-config" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.852818 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="421f011a-6b43-4b9d-9fa7-c293dc581234" containerName="glance-db-sync" Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.852826 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4f8444-d09b-4213-be5c-585c699d29ae" 
containerName="swift-ring-rebalance"
Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.853659 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.880347 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99b585b75-58xkk"]
Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.929182 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrtd\" (UniqueName: \"kubernetes.io/projected/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-kube-api-access-clrtd\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.929476 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-dns-svc\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.929549 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-nb\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.929617 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-sb\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:43 crc kubenswrapper[4626]: I0223 06:58:43.929788 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-config\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.032088 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-sb\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.032152 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-config\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.032206 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clrtd\" (UniqueName: \"kubernetes.io/projected/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-kube-api-access-clrtd\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.032236 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-dns-svc\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.032292 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-nb\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.033303 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-nb\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.033297 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-sb\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.033345 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-config\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.033402 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-dns-svc\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.048542 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clrtd\" (UniqueName: \"kubernetes.io/projected/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-kube-api-access-clrtd\") pod \"dnsmasq-dns-99b585b75-58xkk\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.173186 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:44 crc kubenswrapper[4626]: I0223 06:58:44.811236 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99b585b75-58xkk"]
Feb 23 06:58:45 crc kubenswrapper[4626]: I0223 06:58:45.475982 4626 generic.go:334] "Generic (PLEG): container finished" podID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerID="2634bbd63023c0de3a1f692a02fc08f73ed899056ccd6f23e85d165bbfa74175" exitCode=0
Feb 23 06:58:45 crc kubenswrapper[4626]: I0223 06:58:45.476196 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99b585b75-58xkk" event={"ID":"1c870a4f-2c97-4836-9522-fd73c0a9d3ef","Type":"ContainerDied","Data":"2634bbd63023c0de3a1f692a02fc08f73ed899056ccd6f23e85d165bbfa74175"}
Feb 23 06:58:45 crc kubenswrapper[4626]: I0223 06:58:45.477516 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99b585b75-58xkk" event={"ID":"1c870a4f-2c97-4836-9522-fd73c0a9d3ef","Type":"ContainerStarted","Data":"025e803dc16bcd91ec64717b4521a5c89b0ca31980cba7fa3f0769b39723a209"}
Feb 23 06:58:46 crc kubenswrapper[4626]: I0223 06:58:46.493320 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99b585b75-58xkk" event={"ID":"1c870a4f-2c97-4836-9522-fd73c0a9d3ef","Type":"ContainerStarted","Data":"f174a150e2af866afa793084424f2b440d6fce9475cb386254854f53188be4db"}
Feb 23 06:58:46 crc kubenswrapper[4626]: I0223 06:58:46.493797 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-99b585b75-58xkk"
Feb 23 06:58:46 crc kubenswrapper[4626]: I0223 06:58:46.521031 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-99b585b75-58xkk" podStartSLOduration=3.521001468 podStartE2EDuration="3.521001468s" podCreationTimestamp="2026-02-23 06:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:46.511321195 +0000 UTC m=+1078.850650461" watchObservedRunningTime="2026-02-23 06:58:46.521001468 +0000 UTC m=+1078.860330734"
Feb 23 06:58:48 crc kubenswrapper[4626]: I0223 06:58:48.316486 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:48 crc kubenswrapper[4626]: I0223 06:58:48.323933 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d736eeb-711a-4553-96ff-2b0d9741ac28-etc-swift\") pod \"swift-storage-0\" (UID: \"5d736eeb-711a-4553-96ff-2b0d9741ac28\") " pod="openstack/swift-storage-0"
Feb 23 06:58:48 crc kubenswrapper[4626]: I0223 06:58:48.541778 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 23 06:58:49 crc kubenswrapper[4626]: I0223 06:58:49.492965 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 23 06:58:49 crc kubenswrapper[4626]: I0223 06:58:49.517771 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"4e3c8904042498293a76e946dada8885e449cae58c3569c66672279e55b35892"}
Feb 23 06:58:49 crc kubenswrapper[4626]: I0223 06:58:49.538647 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 23 06:58:49 crc kubenswrapper[4626]: I0223 06:58:49.815880 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 06:58:49 crc kubenswrapper[4626]: I0223 06:58:49.856029 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-25mlg"]
Feb 23 06:58:49 crc kubenswrapper[4626]: I0223 06:58:49.857208 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:49 crc kubenswrapper[4626]: I0223 06:58:49.884621 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-25mlg"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.008660 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-be2b-account-create-update-958v4"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.009747 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.017012 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.023197 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-be2b-account-create-update-958v4"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.058262 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slks7\" (UniqueName: \"kubernetes.io/projected/51e24e17-251e-4136-96cd-30d6a0fc3ee4-kube-api-access-slks7\") pod \"heat-db-create-25mlg\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.058455 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e24e17-251e-4136-96cd-30d6a0fc3ee4-operator-scripts\") pod \"heat-db-create-25mlg\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.147110 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-pvhs6"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.148240 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.159919 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slks7\" (UniqueName: \"kubernetes.io/projected/51e24e17-251e-4136-96cd-30d6a0fc3ee4-kube-api-access-slks7\") pod \"heat-db-create-25mlg\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.160097 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv76m\" (UniqueName: \"kubernetes.io/projected/62c64a62-5f00-430d-af04-03ec55d5029d-kube-api-access-fv76m\") pod \"cinder-be2b-account-create-update-958v4\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.160225 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e24e17-251e-4136-96cd-30d6a0fc3ee4-operator-scripts\") pod \"heat-db-create-25mlg\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.161011 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e24e17-251e-4136-96cd-30d6a0fc3ee4-operator-scripts\") pod \"heat-db-create-25mlg\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.161199 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62c64a62-5f00-430d-af04-03ec55d5029d-operator-scripts\") pod \"cinder-be2b-account-create-update-958v4\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.172295 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pvhs6"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.194447 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slks7\" (UniqueName: \"kubernetes.io/projected/51e24e17-251e-4136-96cd-30d6a0fc3ee4-kube-api-access-slks7\") pod \"heat-db-create-25mlg\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.260124 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-d699-account-create-update-4hp8b"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.261114 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.262746 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv76m\" (UniqueName: \"kubernetes.io/projected/62c64a62-5f00-430d-af04-03ec55d5029d-kube-api-access-fv76m\") pod \"cinder-be2b-account-create-update-958v4\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.262798 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b08b605-e241-47db-98f5-3b6051c589ee-operator-scripts\") pod \"cinder-db-create-pvhs6\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.262854 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62c64a62-5f00-430d-af04-03ec55d5029d-operator-scripts\") pod \"cinder-be2b-account-create-update-958v4\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.262925 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6d5t\" (UniqueName: \"kubernetes.io/projected/3b08b605-e241-47db-98f5-3b6051c589ee-kube-api-access-g6d5t\") pod \"cinder-db-create-pvhs6\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.264190 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62c64a62-5f00-430d-af04-03ec55d5029d-operator-scripts\") pod \"cinder-be2b-account-create-update-958v4\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.267387 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.275276 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d699-account-create-update-4hp8b"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.302166 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv76m\" (UniqueName: \"kubernetes.io/projected/62c64a62-5f00-430d-af04-03ec55d5029d-kube-api-access-fv76m\") pod \"cinder-be2b-account-create-update-958v4\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.343368 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-be2b-account-create-update-958v4"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.364482 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b08b605-e241-47db-98f5-3b6051c589ee-operator-scripts\") pod \"cinder-db-create-pvhs6\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.364634 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6d5t\" (UniqueName: \"kubernetes.io/projected/3b08b605-e241-47db-98f5-3b6051c589ee-kube-api-access-g6d5t\") pod \"cinder-db-create-pvhs6\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.364715 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c4002ff-ea4d-4c5d-a4da-793513d51e83-operator-scripts\") pod \"heat-d699-account-create-update-4hp8b\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.365109 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cjkw\" (UniqueName: \"kubernetes.io/projected/4c4002ff-ea4d-4c5d-a4da-793513d51e83-kube-api-access-4cjkw\") pod \"heat-d699-account-create-update-4hp8b\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.365324 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b08b605-e241-47db-98f5-3b6051c589ee-operator-scripts\") pod \"cinder-db-create-pvhs6\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.386744 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5hrgs"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.387672 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.390901 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-25kl2"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.397366 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.397589 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.397738 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.401570 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6d5t\" (UniqueName: \"kubernetes.io/projected/3b08b605-e241-47db-98f5-3b6051c589ee-kube-api-access-g6d5t\") pod \"cinder-db-create-pvhs6\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.407763 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5hrgs"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.464885 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pvhs6"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.466377 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c4002ff-ea4d-4c5d-a4da-793513d51e83-operator-scripts\") pod \"heat-d699-account-create-update-4hp8b\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.467386 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c4002ff-ea4d-4c5d-a4da-793513d51e83-operator-scripts\") pod \"heat-d699-account-create-update-4hp8b\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.477715 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cjkw\" (UniqueName: \"kubernetes.io/projected/4c4002ff-ea4d-4c5d-a4da-793513d51e83-kube-api-access-4cjkw\") pod \"heat-d699-account-create-update-4hp8b\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.483032 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4vwh7"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.486999 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.488093 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-25mlg"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.507674 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4vwh7"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.509989 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cjkw\" (UniqueName: \"kubernetes.io/projected/4c4002ff-ea4d-4c5d-a4da-793513d51e83-kube-api-access-4cjkw\") pod \"heat-d699-account-create-update-4hp8b\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.579688 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-operator-scripts\") pod \"neutron-db-create-4vwh7\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.580005 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68rqd\" (UniqueName: \"kubernetes.io/projected/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-kube-api-access-68rqd\") pod \"neutron-db-create-4vwh7\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.580064 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5nwq\" (UniqueName: \"kubernetes.io/projected/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-kube-api-access-q5nwq\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.580128 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-combined-ca-bundle\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.580177 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-config-data\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.586416 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d699-account-create-update-4hp8b"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.632058 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9fa0-account-create-update-nshxv"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.633349 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9fa0-account-create-update-nshxv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.639173 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.669513 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-hpjks"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.670809 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hpjks"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.683400 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-combined-ca-bundle\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.683476 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8qr\" (UniqueName: \"kubernetes.io/projected/13a600db-8797-4c3e-98ee-d98d7daf59f9-kube-api-access-6s8qr\") pod \"barbican-db-create-hpjks\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " pod="openstack/barbican-db-create-hpjks"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.683520 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-config-data\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.683606 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a600db-8797-4c3e-98ee-d98d7daf59f9-operator-scripts\") pod \"barbican-db-create-hpjks\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " pod="openstack/barbican-db-create-hpjks"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.683645 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smtx\" (UniqueName: \"kubernetes.io/projected/c44bbc0b-c334-4a3e-8a51-10394d82a253-kube-api-access-5smtx\") pod \"neutron-9fa0-account-create-update-nshxv\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " pod="openstack/neutron-9fa0-account-create-update-nshxv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.687393 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-operator-scripts\") pod \"neutron-db-create-4vwh7\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.687436 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68rqd\" (UniqueName: \"kubernetes.io/projected/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-kube-api-access-68rqd\") pod \"neutron-db-create-4vwh7\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.687492 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44bbc0b-c334-4a3e-8a51-10394d82a253-operator-scripts\") pod \"neutron-9fa0-account-create-update-nshxv\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " pod="openstack/neutron-9fa0-account-create-update-nshxv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.687549 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5nwq\" (UniqueName: \"kubernetes.io/projected/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-kube-api-access-q5nwq\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.688371 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-operator-scripts\") pod \"neutron-db-create-4vwh7\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.704367 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-combined-ca-bundle\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.709443 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-config-data\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.727016 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-hpjks"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.744121 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68rqd\" (UniqueName: \"kubernetes.io/projected/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-kube-api-access-68rqd\") pod \"neutron-db-create-4vwh7\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.746440 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5nwq\" (UniqueName: \"kubernetes.io/projected/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-kube-api-access-q5nwq\") pod \"keystone-db-sync-5hrgs\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.769590 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9fa0-account-create-update-nshxv"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.772339 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5hrgs"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.788763 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44bbc0b-c334-4a3e-8a51-10394d82a253-operator-scripts\") pod \"neutron-9fa0-account-create-update-nshxv\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " pod="openstack/neutron-9fa0-account-create-update-nshxv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.789835 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8qr\" (UniqueName: \"kubernetes.io/projected/13a600db-8797-4c3e-98ee-d98d7daf59f9-kube-api-access-6s8qr\") pod \"barbican-db-create-hpjks\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " pod="openstack/barbican-db-create-hpjks"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.790005 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a600db-8797-4c3e-98ee-d98d7daf59f9-operator-scripts\") pod \"barbican-db-create-hpjks\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " pod="openstack/barbican-db-create-hpjks"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.790098 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smtx\" (UniqueName: \"kubernetes.io/projected/c44bbc0b-c334-4a3e-8a51-10394d82a253-kube-api-access-5smtx\") pod \"neutron-9fa0-account-create-update-nshxv\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " pod="openstack/neutron-9fa0-account-create-update-nshxv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.789695 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44bbc0b-c334-4a3e-8a51-10394d82a253-operator-scripts\") pod \"neutron-9fa0-account-create-update-nshxv\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " pod="openstack/neutron-9fa0-account-create-update-nshxv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.791252 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a600db-8797-4c3e-98ee-d98d7daf59f9-operator-scripts\") pod \"barbican-db-create-hpjks\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " pod="openstack/barbican-db-create-hpjks"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.809508 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-97a9-account-create-update-k7jvv"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.810737 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97a9-account-create-update-k7jvv"]
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.810878 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97a9-account-create-update-k7jvv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.819582 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.822285 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4vwh7"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.829050 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8qr\" (UniqueName: \"kubernetes.io/projected/13a600db-8797-4c3e-98ee-d98d7daf59f9-kube-api-access-6s8qr\") pod \"barbican-db-create-hpjks\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " pod="openstack/barbican-db-create-hpjks"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.832143 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smtx\" (UniqueName: \"kubernetes.io/projected/c44bbc0b-c334-4a3e-8a51-10394d82a253-kube-api-access-5smtx\") pod \"neutron-9fa0-account-create-update-nshxv\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " pod="openstack/neutron-9fa0-account-create-update-nshxv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.898198 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb7035a-577f-4291-b0dd-e4ee6e011018-operator-scripts\") pod \"barbican-97a9-account-create-update-k7jvv\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " pod="openstack/barbican-97a9-account-create-update-k7jvv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.898859 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8w8\" (UniqueName: \"kubernetes.io/projected/eeb7035a-577f-4291-b0dd-e4ee6e011018-kube-api-access-dc8w8\") pod \"barbican-97a9-account-create-update-k7jvv\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " pod="openstack/barbican-97a9-account-create-update-k7jvv"
Feb 23 06:58:50 crc kubenswrapper[4626]: I0223 06:58:50.938111 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-be2b-account-create-update-958v4"]
Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.000148 4626
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8w8\" (UniqueName: \"kubernetes.io/projected/eeb7035a-577f-4291-b0dd-e4ee6e011018-kube-api-access-dc8w8\") pod \"barbican-97a9-account-create-update-k7jvv\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " pod="openstack/barbican-97a9-account-create-update-k7jvv" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.000217 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb7035a-577f-4291-b0dd-e4ee6e011018-operator-scripts\") pod \"barbican-97a9-account-create-update-k7jvv\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " pod="openstack/barbican-97a9-account-create-update-k7jvv" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.000816 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb7035a-577f-4291-b0dd-e4ee6e011018-operator-scripts\") pod \"barbican-97a9-account-create-update-k7jvv\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " pod="openstack/barbican-97a9-account-create-update-k7jvv" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.050050 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8w8\" (UniqueName: \"kubernetes.io/projected/eeb7035a-577f-4291-b0dd-e4ee6e011018-kube-api-access-dc8w8\") pod \"barbican-97a9-account-create-update-k7jvv\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " pod="openstack/barbican-97a9-account-create-update-k7jvv" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.051375 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9fa0-account-create-update-nshxv" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.061097 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-hpjks" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.157425 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97a9-account-create-update-k7jvv" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.320092 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-25mlg"] Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.581528 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-25mlg" event={"ID":"51e24e17-251e-4136-96cd-30d6a0fc3ee4","Type":"ContainerStarted","Data":"9ea0f4483ce0e21468e3021d43633248221b1a12448a62015441dfca133470ec"} Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.592316 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pvhs6"] Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.595322 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-be2b-account-create-update-958v4" event={"ID":"62c64a62-5f00-430d-af04-03ec55d5029d","Type":"ContainerStarted","Data":"799ea3c4c27f9b76b968a94c0ac8a653eac2bc08dea4c8be759b7fe141e64e60"} Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.595373 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-be2b-account-create-update-958v4" event={"ID":"62c64a62-5f00-430d-af04-03ec55d5029d","Type":"ContainerStarted","Data":"0fe27ea3cda0aa8e00f6afffc6ac58243d138a1d8aaaaba499d5cbac0be24b31"} Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.644230 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-be2b-account-create-update-958v4" podStartSLOduration=2.6442129359999997 podStartE2EDuration="2.644212936s" podCreationTimestamp="2026-02-23 06:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 
06:58:51.619481276 +0000 UTC m=+1083.958810541" watchObservedRunningTime="2026-02-23 06:58:51.644212936 +0000 UTC m=+1083.983542203" Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.692883 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d699-account-create-update-4hp8b"] Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.746164 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5hrgs"] Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.753860 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4vwh7"] Feb 23 06:58:51 crc kubenswrapper[4626]: W0223 06:58:51.764670 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48794fdc_dcd4_4e56_88ea_6628ef7b4b80.slice/crio-21624015f404b7259f64be9b65acf739cd959877797a67e64474a2c355071b9c WatchSource:0}: Error finding container 21624015f404b7259f64be9b65acf739cd959877797a67e64474a2c355071b9c: Status 404 returned error can't find the container with id 21624015f404b7259f64be9b65acf739cd959877797a67e64474a2c355071b9c Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.907143 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-hpjks"] Feb 23 06:58:51 crc kubenswrapper[4626]: I0223 06:58:51.940531 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9fa0-account-create-update-nshxv"] Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.020957 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97a9-account-create-update-k7jvv"] Feb 23 06:58:52 crc kubenswrapper[4626]: W0223 06:58:52.033832 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeb7035a_577f_4291_b0dd_e4ee6e011018.slice/crio-5e4ab3dcea4a1b12b6687837dd33204765d26a14f46a6dceb739d5c3d3419329 
WatchSource:0}: Error finding container 5e4ab3dcea4a1b12b6687837dd33204765d26a14f46a6dceb739d5c3d3419329: Status 404 returned error can't find the container with id 5e4ab3dcea4a1b12b6687837dd33204765d26a14f46a6dceb739d5c3d3419329 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.604561 4626 generic.go:334] "Generic (PLEG): container finished" podID="c44bbc0b-c334-4a3e-8a51-10394d82a253" containerID="296b5ead7448f29c39b2e2af1237718207ca790f3206d8ca64b163f689cee403" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.604632 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9fa0-account-create-update-nshxv" event={"ID":"c44bbc0b-c334-4a3e-8a51-10394d82a253","Type":"ContainerDied","Data":"296b5ead7448f29c39b2e2af1237718207ca790f3206d8ca64b163f689cee403"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.604663 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9fa0-account-create-update-nshxv" event={"ID":"c44bbc0b-c334-4a3e-8a51-10394d82a253","Type":"ContainerStarted","Data":"fa76023f0e402e322f48c6c71fcfaba3b189be9afb6f14e90c78247e43eb4f11"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.615120 4626 generic.go:334] "Generic (PLEG): container finished" podID="eeb7035a-577f-4291-b0dd-e4ee6e011018" containerID="8eaf17ab19ba278ff22cac801526ab596f6d7cf5023260078ab0a2da276d7106" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.615204 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97a9-account-create-update-k7jvv" event={"ID":"eeb7035a-577f-4291-b0dd-e4ee6e011018","Type":"ContainerDied","Data":"8eaf17ab19ba278ff22cac801526ab596f6d7cf5023260078ab0a2da276d7106"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.615259 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97a9-account-create-update-k7jvv" 
event={"ID":"eeb7035a-577f-4291-b0dd-e4ee6e011018","Type":"ContainerStarted","Data":"5e4ab3dcea4a1b12b6687837dd33204765d26a14f46a6dceb739d5c3d3419329"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.617575 4626 generic.go:334] "Generic (PLEG): container finished" podID="13a600db-8797-4c3e-98ee-d98d7daf59f9" containerID="297990957fc057e6488e74b739fedf58792340d719618b5015b3c40259b71805" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.617675 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hpjks" event={"ID":"13a600db-8797-4c3e-98ee-d98d7daf59f9","Type":"ContainerDied","Data":"297990957fc057e6488e74b739fedf58792340d719618b5015b3c40259b71805"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.617707 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hpjks" event={"ID":"13a600db-8797-4c3e-98ee-d98d7daf59f9","Type":"ContainerStarted","Data":"58aacb566936b3601edb3fe4d93ed0ac10c0a0fe6166144898bedec60f7feb01"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.619239 4626 generic.go:334] "Generic (PLEG): container finished" podID="51e24e17-251e-4136-96cd-30d6a0fc3ee4" containerID="5a10d71f4f20accc93d10a575735e6934c5a3db8378b2601d8eb0bfcb5298786" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.619299 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-25mlg" event={"ID":"51e24e17-251e-4136-96cd-30d6a0fc3ee4","Type":"ContainerDied","Data":"5a10d71f4f20accc93d10a575735e6934c5a3db8378b2601d8eb0bfcb5298786"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.630778 4626 generic.go:334] "Generic (PLEG): container finished" podID="62c64a62-5f00-430d-af04-03ec55d5029d" containerID="799ea3c4c27f9b76b968a94c0ac8a653eac2bc08dea4c8be759b7fe141e64e60" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.631022 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-be2b-account-create-update-958v4" event={"ID":"62c64a62-5f00-430d-af04-03ec55d5029d","Type":"ContainerDied","Data":"799ea3c4c27f9b76b968a94c0ac8a653eac2bc08dea4c8be759b7fe141e64e60"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.634075 4626 generic.go:334] "Generic (PLEG): container finished" podID="4c4002ff-ea4d-4c5d-a4da-793513d51e83" containerID="eec1e8e901634153f11a7a885ba0be5973fd9517d02f370a7f6ada4f19c7a8ca" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.634120 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d699-account-create-update-4hp8b" event={"ID":"4c4002ff-ea4d-4c5d-a4da-793513d51e83","Type":"ContainerDied","Data":"eec1e8e901634153f11a7a885ba0be5973fd9517d02f370a7f6ada4f19c7a8ca"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.634165 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d699-account-create-update-4hp8b" event={"ID":"4c4002ff-ea4d-4c5d-a4da-793513d51e83","Type":"ContainerStarted","Data":"5f87837eb8a7fe918011a5b3090c9a38233c9d5150e79245b4cdbedd6e1100b1"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.642366 4626 generic.go:334] "Generic (PLEG): container finished" podID="3b08b605-e241-47db-98f5-3b6051c589ee" containerID="aeb9eefa3727ba91125e0fcde1f5155a3590ae339ab8bbca8134de8447fbef33" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.642429 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pvhs6" event={"ID":"3b08b605-e241-47db-98f5-3b6051c589ee","Type":"ContainerDied","Data":"aeb9eefa3727ba91125e0fcde1f5155a3590ae339ab8bbca8134de8447fbef33"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.642461 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pvhs6" event={"ID":"3b08b605-e241-47db-98f5-3b6051c589ee","Type":"ContainerStarted","Data":"8f5dc2b712218e51850aedd14caa4a2628001be97d52d4036d0cb02a058a4f22"} Feb 23 06:58:52 crc 
kubenswrapper[4626]: I0223 06:58:52.643529 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5hrgs" event={"ID":"f759b0fe-50ac-4eb6-8539-c34dcf9cf501","Type":"ContainerStarted","Data":"89177a5eef2b9f4cdf3b97058c3078954733d3e4a37ba1db577770335ae670de"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.644593 4626 generic.go:334] "Generic (PLEG): container finished" podID="48794fdc-dcd4-4e56-88ea-6628ef7b4b80" containerID="cefb121300a0e7a9bd8e74f118c263b7612f556b4ea99e6b50d61895b38954e6" exitCode=0 Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.644625 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4vwh7" event={"ID":"48794fdc-dcd4-4e56-88ea-6628ef7b4b80","Type":"ContainerDied","Data":"cefb121300a0e7a9bd8e74f118c263b7612f556b4ea99e6b50d61895b38954e6"} Feb 23 06:58:52 crc kubenswrapper[4626]: I0223 06:58:52.644643 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4vwh7" event={"ID":"48794fdc-dcd4-4e56-88ea-6628ef7b4b80","Type":"ContainerStarted","Data":"21624015f404b7259f64be9b65acf739cd959877797a67e64474a2c355071b9c"} Feb 23 06:58:53 crc kubenswrapper[4626]: I0223 06:58:53.659397 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"45a856a8300bf2f6b0f383689da7818837b3ef2a8ec6d02605e0278be6a293a2"} Feb 23 06:58:53 crc kubenswrapper[4626]: I0223 06:58:53.660203 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"a392885a0929044a8c68141e5b860cb8870d7daaa0976b9152bb7c8dc4a82e8b"} Feb 23 06:58:53 crc kubenswrapper[4626]: I0223 06:58:53.660223 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"71bc1cc7445633c923bc4c0ac7b09e1e6552346694ab7837ef1ae34603e90fa3"} Feb 23 06:58:53 crc kubenswrapper[4626]: I0223 06:58:53.660233 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"62eab858cfd754678ea0ba38d52392afa778531751cfb47ba49eab935cfbd4c8"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.104750 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pvhs6" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.174798 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-99b585b75-58xkk" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.211934 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6d5t\" (UniqueName: \"kubernetes.io/projected/3b08b605-e241-47db-98f5-3b6051c589ee-kube-api-access-g6d5t\") pod \"3b08b605-e241-47db-98f5-3b6051c589ee\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.212086 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b08b605-e241-47db-98f5-3b6051c589ee-operator-scripts\") pod \"3b08b605-e241-47db-98f5-3b6051c589ee\" (UID: \"3b08b605-e241-47db-98f5-3b6051c589ee\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.216747 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b08b605-e241-47db-98f5-3b6051c589ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b08b605-e241-47db-98f5-3b6051c589ee" (UID: "3b08b605-e241-47db-98f5-3b6051c589ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.257062 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-687c99d675-v44bm"] Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.257382 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-687c99d675-v44bm" podUID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerName="dnsmasq-dns" containerID="cri-o://df891bf3b7a98e443ae0db320f1a88d8910f36d0a315cf9b4ff41dc3dde7ab7c" gracePeriod=10 Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.264207 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b08b605-e241-47db-98f5-3b6051c589ee-kube-api-access-g6d5t" (OuterVolumeSpecName: "kube-api-access-g6d5t") pod "3b08b605-e241-47db-98f5-3b6051c589ee" (UID: "3b08b605-e241-47db-98f5-3b6051c589ee"). InnerVolumeSpecName "kube-api-access-g6d5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.315387 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6d5t\" (UniqueName: \"kubernetes.io/projected/3b08b605-e241-47db-98f5-3b6051c589ee-kube-api-access-g6d5t\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.315408 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b08b605-e241-47db-98f5-3b6051c589ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.395561 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4vwh7" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.404149 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-d699-account-create-update-4hp8b" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.412736 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-25mlg" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.465822 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9fa0-account-create-update-nshxv" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.468020 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97a9-account-create-update-k7jvv" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.491449 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-be2b-account-create-update-958v4" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.495560 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hpjks" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.519333 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cjkw\" (UniqueName: \"kubernetes.io/projected/4c4002ff-ea4d-4c5d-a4da-793513d51e83-kube-api-access-4cjkw\") pod \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.519517 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-operator-scripts\") pod \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.519562 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slks7\" (UniqueName: 
\"kubernetes.io/projected/51e24e17-251e-4136-96cd-30d6a0fc3ee4-kube-api-access-slks7\") pod \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.519642 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68rqd\" (UniqueName: \"kubernetes.io/projected/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-kube-api-access-68rqd\") pod \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\" (UID: \"48794fdc-dcd4-4e56-88ea-6628ef7b4b80\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.519708 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c4002ff-ea4d-4c5d-a4da-793513d51e83-operator-scripts\") pod \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\" (UID: \"4c4002ff-ea4d-4c5d-a4da-793513d51e83\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.519735 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e24e17-251e-4136-96cd-30d6a0fc3ee4-operator-scripts\") pod \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\" (UID: \"51e24e17-251e-4136-96cd-30d6a0fc3ee4\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.521348 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e24e17-251e-4136-96cd-30d6a0fc3ee4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51e24e17-251e-4136-96cd-30d6a0fc3ee4" (UID: "51e24e17-251e-4136-96cd-30d6a0fc3ee4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.525145 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e24e17-251e-4136-96cd-30d6a0fc3ee4-kube-api-access-slks7" (OuterVolumeSpecName: "kube-api-access-slks7") pod "51e24e17-251e-4136-96cd-30d6a0fc3ee4" (UID: "51e24e17-251e-4136-96cd-30d6a0fc3ee4"). InnerVolumeSpecName "kube-api-access-slks7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.528044 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48794fdc-dcd4-4e56-88ea-6628ef7b4b80" (UID: "48794fdc-dcd4-4e56-88ea-6628ef7b4b80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.528030 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4002ff-ea4d-4c5d-a4da-793513d51e83-kube-api-access-4cjkw" (OuterVolumeSpecName: "kube-api-access-4cjkw") pod "4c4002ff-ea4d-4c5d-a4da-793513d51e83" (UID: "4c4002ff-ea4d-4c5d-a4da-793513d51e83"). InnerVolumeSpecName "kube-api-access-4cjkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.528166 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c4002ff-ea4d-4c5d-a4da-793513d51e83-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c4002ff-ea4d-4c5d-a4da-793513d51e83" (UID: "4c4002ff-ea4d-4c5d-a4da-793513d51e83"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.531935 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-kube-api-access-68rqd" (OuterVolumeSpecName: "kube-api-access-68rqd") pod "48794fdc-dcd4-4e56-88ea-6628ef7b4b80" (UID: "48794fdc-dcd4-4e56-88ea-6628ef7b4b80"). InnerVolumeSpecName "kube-api-access-68rqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.621817 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44bbc0b-c334-4a3e-8a51-10394d82a253-operator-scripts\") pod \"c44bbc0b-c334-4a3e-8a51-10394d82a253\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.621866 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv76m\" (UniqueName: \"kubernetes.io/projected/62c64a62-5f00-430d-af04-03ec55d5029d-kube-api-access-fv76m\") pod \"62c64a62-5f00-430d-af04-03ec55d5029d\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.621888 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44bbc0b-c334-4a3e-8a51-10394d82a253-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c44bbc0b-c334-4a3e-8a51-10394d82a253" (UID: "c44bbc0b-c334-4a3e-8a51-10394d82a253"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.621962 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb7035a-577f-4291-b0dd-e4ee6e011018-operator-scripts\") pod \"eeb7035a-577f-4291-b0dd-e4ee6e011018\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.621994 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62c64a62-5f00-430d-af04-03ec55d5029d-operator-scripts\") pod \"62c64a62-5f00-430d-af04-03ec55d5029d\" (UID: \"62c64a62-5f00-430d-af04-03ec55d5029d\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622124 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a600db-8797-4c3e-98ee-d98d7daf59f9-operator-scripts\") pod \"13a600db-8797-4c3e-98ee-d98d7daf59f9\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622264 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc8w8\" (UniqueName: \"kubernetes.io/projected/eeb7035a-577f-4291-b0dd-e4ee6e011018-kube-api-access-dc8w8\") pod \"eeb7035a-577f-4291-b0dd-e4ee6e011018\" (UID: \"eeb7035a-577f-4291-b0dd-e4ee6e011018\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622346 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5smtx\" (UniqueName: \"kubernetes.io/projected/c44bbc0b-c334-4a3e-8a51-10394d82a253-kube-api-access-5smtx\") pod \"c44bbc0b-c334-4a3e-8a51-10394d82a253\" (UID: \"c44bbc0b-c334-4a3e-8a51-10394d82a253\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622385 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-6s8qr\" (UniqueName: \"kubernetes.io/projected/13a600db-8797-4c3e-98ee-d98d7daf59f9-kube-api-access-6s8qr\") pod \"13a600db-8797-4c3e-98ee-d98d7daf59f9\" (UID: \"13a600db-8797-4c3e-98ee-d98d7daf59f9\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622830 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slks7\" (UniqueName: \"kubernetes.io/projected/51e24e17-251e-4136-96cd-30d6a0fc3ee4-kube-api-access-slks7\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622844 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68rqd\" (UniqueName: \"kubernetes.io/projected/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-kube-api-access-68rqd\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622854 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c4002ff-ea4d-4c5d-a4da-793513d51e83-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622863 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e24e17-251e-4136-96cd-30d6a0fc3ee4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622872 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cjkw\" (UniqueName: \"kubernetes.io/projected/4c4002ff-ea4d-4c5d-a4da-793513d51e83-kube-api-access-4cjkw\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622883 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c44bbc0b-c334-4a3e-8a51-10394d82a253-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.622891 4626 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48794fdc-dcd4-4e56-88ea-6628ef7b4b80-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.623289 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a600db-8797-4c3e-98ee-d98d7daf59f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "13a600db-8797-4c3e-98ee-d98d7daf59f9" (UID: "13a600db-8797-4c3e-98ee-d98d7daf59f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.623701 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb7035a-577f-4291-b0dd-e4ee6e011018-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eeb7035a-577f-4291-b0dd-e4ee6e011018" (UID: "eeb7035a-577f-4291-b0dd-e4ee6e011018"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.624419 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c64a62-5f00-430d-af04-03ec55d5029d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62c64a62-5f00-430d-af04-03ec55d5029d" (UID: "62c64a62-5f00-430d-af04-03ec55d5029d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.624886 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c64a62-5f00-430d-af04-03ec55d5029d-kube-api-access-fv76m" (OuterVolumeSpecName: "kube-api-access-fv76m") pod "62c64a62-5f00-430d-af04-03ec55d5029d" (UID: "62c64a62-5f00-430d-af04-03ec55d5029d"). InnerVolumeSpecName "kube-api-access-fv76m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.629466 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44bbc0b-c334-4a3e-8a51-10394d82a253-kube-api-access-5smtx" (OuterVolumeSpecName: "kube-api-access-5smtx") pod "c44bbc0b-c334-4a3e-8a51-10394d82a253" (UID: "c44bbc0b-c334-4a3e-8a51-10394d82a253"). InnerVolumeSpecName "kube-api-access-5smtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.633410 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb7035a-577f-4291-b0dd-e4ee6e011018-kube-api-access-dc8w8" (OuterVolumeSpecName: "kube-api-access-dc8w8") pod "eeb7035a-577f-4291-b0dd-e4ee6e011018" (UID: "eeb7035a-577f-4291-b0dd-e4ee6e011018"). InnerVolumeSpecName "kube-api-access-dc8w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.661984 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a600db-8797-4c3e-98ee-d98d7daf59f9-kube-api-access-6s8qr" (OuterVolumeSpecName: "kube-api-access-6s8qr") pod "13a600db-8797-4c3e-98ee-d98d7daf59f9" (UID: "13a600db-8797-4c3e-98ee-d98d7daf59f9"). InnerVolumeSpecName "kube-api-access-6s8qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.680405 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-25mlg" event={"ID":"51e24e17-251e-4136-96cd-30d6a0fc3ee4","Type":"ContainerDied","Data":"9ea0f4483ce0e21468e3021d43633248221b1a12448a62015441dfca133470ec"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.680440 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea0f4483ce0e21468e3021d43633248221b1a12448a62015441dfca133470ec" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.680523 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-25mlg" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.683509 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-be2b-account-create-update-958v4" event={"ID":"62c64a62-5f00-430d-af04-03ec55d5029d","Type":"ContainerDied","Data":"0fe27ea3cda0aa8e00f6afffc6ac58243d138a1d8aaaaba499d5cbac0be24b31"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.683547 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fe27ea3cda0aa8e00f6afffc6ac58243d138a1d8aaaaba499d5cbac0be24b31" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.683616 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-be2b-account-create-update-958v4" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.692811 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4vwh7" event={"ID":"48794fdc-dcd4-4e56-88ea-6628ef7b4b80","Type":"ContainerDied","Data":"21624015f404b7259f64be9b65acf739cd959877797a67e64474a2c355071b9c"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.692843 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21624015f404b7259f64be9b65acf739cd959877797a67e64474a2c355071b9c" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.692904 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4vwh7" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.699876 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9fa0-account-create-update-nshxv" event={"ID":"c44bbc0b-c334-4a3e-8a51-10394d82a253","Type":"ContainerDied","Data":"fa76023f0e402e322f48c6c71fcfaba3b189be9afb6f14e90c78247e43eb4f11"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.699913 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa76023f0e402e322f48c6c71fcfaba3b189be9afb6f14e90c78247e43eb4f11" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.699966 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9fa0-account-create-update-nshxv" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.719437 4626 generic.go:334] "Generic (PLEG): container finished" podID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerID="df891bf3b7a98e443ae0db320f1a88d8910f36d0a315cf9b4ff41dc3dde7ab7c" exitCode=0 Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.719485 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687c99d675-v44bm" event={"ID":"70d23afb-92d9-44e1-896e-c048ca8fe3d7","Type":"ContainerDied","Data":"df891bf3b7a98e443ae0db320f1a88d8910f36d0a315cf9b4ff41dc3dde7ab7c"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.721094 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d699-account-create-update-4hp8b" event={"ID":"4c4002ff-ea4d-4c5d-a4da-793513d51e83","Type":"ContainerDied","Data":"5f87837eb8a7fe918011a5b3090c9a38233c9d5150e79245b4cdbedd6e1100b1"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.721116 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f87837eb8a7fe918011a5b3090c9a38233c9d5150e79245b4cdbedd6e1100b1" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.721160 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-d699-account-create-update-4hp8b" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.729421 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97a9-account-create-update-k7jvv" event={"ID":"eeb7035a-577f-4291-b0dd-e4ee6e011018","Type":"ContainerDied","Data":"5e4ab3dcea4a1b12b6687837dd33204765d26a14f46a6dceb739d5c3d3419329"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.729475 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e4ab3dcea4a1b12b6687837dd33204765d26a14f46a6dceb739d5c3d3419329" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.729572 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97a9-account-create-update-k7jvv" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.734765 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pvhs6" event={"ID":"3b08b605-e241-47db-98f5-3b6051c589ee","Type":"ContainerDied","Data":"8f5dc2b712218e51850aedd14caa4a2628001be97d52d4036d0cb02a058a4f22"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.736705 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f5dc2b712218e51850aedd14caa4a2628001be97d52d4036d0cb02a058a4f22" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.736611 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pvhs6" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.740210 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8w8\" (UniqueName: \"kubernetes.io/projected/eeb7035a-577f-4291-b0dd-e4ee6e011018-kube-api-access-dc8w8\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.740328 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5smtx\" (UniqueName: \"kubernetes.io/projected/c44bbc0b-c334-4a3e-8a51-10394d82a253-kube-api-access-5smtx\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.740720 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s8qr\" (UniqueName: \"kubernetes.io/projected/13a600db-8797-4c3e-98ee-d98d7daf59f9-kube-api-access-6s8qr\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.740759 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv76m\" (UniqueName: \"kubernetes.io/projected/62c64a62-5f00-430d-af04-03ec55d5029d-kube-api-access-fv76m\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.740771 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb7035a-577f-4291-b0dd-e4ee6e011018-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.740781 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62c64a62-5f00-430d-af04-03ec55d5029d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.740793 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13a600db-8797-4c3e-98ee-d98d7daf59f9-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.744318 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hpjks" event={"ID":"13a600db-8797-4c3e-98ee-d98d7daf59f9","Type":"ContainerDied","Data":"58aacb566936b3601edb3fe4d93ed0ac10c0a0fe6166144898bedec60f7feb01"} Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.744357 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58aacb566936b3601edb3fe4d93ed0ac10c0a0fe6166144898bedec60f7feb01" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.744427 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hpjks" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.827395 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-687c99d675-v44bm" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.844294 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-nb\") pod \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.844341 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-config\") pod \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.844363 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-sb\") pod \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " Feb 23 06:58:54 crc 
kubenswrapper[4626]: I0223 06:58:54.844433 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j22w2\" (UniqueName: \"kubernetes.io/projected/70d23afb-92d9-44e1-896e-c048ca8fe3d7-kube-api-access-j22w2\") pod \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.844469 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-dns-svc\") pod \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\" (UID: \"70d23afb-92d9-44e1-896e-c048ca8fe3d7\") " Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.856369 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d23afb-92d9-44e1-896e-c048ca8fe3d7-kube-api-access-j22w2" (OuterVolumeSpecName: "kube-api-access-j22w2") pod "70d23afb-92d9-44e1-896e-c048ca8fe3d7" (UID: "70d23afb-92d9-44e1-896e-c048ca8fe3d7"). InnerVolumeSpecName "kube-api-access-j22w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.886927 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70d23afb-92d9-44e1-896e-c048ca8fe3d7" (UID: "70d23afb-92d9-44e1-896e-c048ca8fe3d7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.891121 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "70d23afb-92d9-44e1-896e-c048ca8fe3d7" (UID: "70d23afb-92d9-44e1-896e-c048ca8fe3d7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.893677 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70d23afb-92d9-44e1-896e-c048ca8fe3d7" (UID: "70d23afb-92d9-44e1-896e-c048ca8fe3d7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.902085 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-config" (OuterVolumeSpecName: "config") pod "70d23afb-92d9-44e1-896e-c048ca8fe3d7" (UID: "70d23afb-92d9-44e1-896e-c048ca8fe3d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.946865 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.946899 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.946911 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.946921 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j22w2\" (UniqueName: \"kubernetes.io/projected/70d23afb-92d9-44e1-896e-c048ca8fe3d7-kube-api-access-j22w2\") on node \"crc\" DevicePath \"\"" Feb 23 
06:58:54 crc kubenswrapper[4626]: I0223 06:58:54.946932 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70d23afb-92d9-44e1-896e-c048ca8fe3d7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.765792 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"df325439234d2e2541bb56b43611a63f4896a22880ce4e09c70bf1c3eef0aa03"} Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.766128 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"3968d1aed3819c37e82ec8d29fd09b696fc8ff115be092efe716dae0c9cc59c6"} Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.767381 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-687c99d675-v44bm" event={"ID":"70d23afb-92d9-44e1-896e-c048ca8fe3d7","Type":"ContainerDied","Data":"222ecd550da2468a5929182a00798ffa83be762887eaf59bfdfbf4ebd190583c"} Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.767422 4626 scope.go:117] "RemoveContainer" containerID="df891bf3b7a98e443ae0db320f1a88d8910f36d0a315cf9b4ff41dc3dde7ab7c" Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.767553 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-687c99d675-v44bm" Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.804795 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-687c99d675-v44bm"] Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.810352 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-687c99d675-v44bm"] Feb 23 06:58:55 crc kubenswrapper[4626]: I0223 06:58:55.999602 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" path="/var/lib/kubelet/pods/70d23afb-92d9-44e1-896e-c048ca8fe3d7/volumes" Feb 23 06:58:57 crc kubenswrapper[4626]: I0223 06:58:57.952784 4626 scope.go:117] "RemoveContainer" containerID="4f960e4e86825d8b344c4b5db81ff620589505fbebf30b2cb10a233cc9ef51af" Feb 23 06:58:58 crc kubenswrapper[4626]: I0223 06:58:58.805331 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5hrgs" event={"ID":"f759b0fe-50ac-4eb6-8539-c34dcf9cf501","Type":"ContainerStarted","Data":"8be512aca6c69f2b206dc98db98e60f65e91a97d1accaff62890686b6521a231"} Feb 23 06:58:58 crc kubenswrapper[4626]: I0223 06:58:58.816172 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"56d68211016fee1c106cbe9290f3b4f33a0b779457e4d8eea14efc9e2fa937f9"} Feb 23 06:58:58 crc kubenswrapper[4626]: I0223 06:58:58.816264 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"bf941febdc440c7647cf00a3b69fabcf34321cd1e8e358220234175ad19d16c0"} Feb 23 06:58:58 crc kubenswrapper[4626]: I0223 06:58:58.831879 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5hrgs" podStartSLOduration=2.581200134 podStartE2EDuration="8.831866713s" 
podCreationTimestamp="2026-02-23 06:58:50 +0000 UTC" firstStartedPulling="2026-02-23 06:58:51.751140266 +0000 UTC m=+1084.090469552" lastFinishedPulling="2026-02-23 06:58:58.001806865 +0000 UTC m=+1090.341136131" observedRunningTime="2026-02-23 06:58:58.825626788 +0000 UTC m=+1091.164956054" watchObservedRunningTime="2026-02-23 06:58:58.831866713 +0000 UTC m=+1091.171195979" Feb 23 06:59:00 crc kubenswrapper[4626]: I0223 06:59:00.834573 4626 generic.go:334] "Generic (PLEG): container finished" podID="f759b0fe-50ac-4eb6-8539-c34dcf9cf501" containerID="8be512aca6c69f2b206dc98db98e60f65e91a97d1accaff62890686b6521a231" exitCode=0 Feb 23 06:59:00 crc kubenswrapper[4626]: I0223 06:59:00.834658 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5hrgs" event={"ID":"f759b0fe-50ac-4eb6-8539-c34dcf9cf501","Type":"ContainerDied","Data":"8be512aca6c69f2b206dc98db98e60f65e91a97d1accaff62890686b6521a231"} Feb 23 06:59:00 crc kubenswrapper[4626]: I0223 06:59:00.842801 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"db00c2cac16b91bffc107436ad17337ca830ccd4016d183c8cdf707f0c5a81f3"} Feb 23 06:59:00 crc kubenswrapper[4626]: I0223 06:59:00.842838 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"507017c40d1cc989c8a3446fd47e0149b191e9189d3363792f1f6c9b46623ff6"} Feb 23 06:59:00 crc kubenswrapper[4626]: I0223 06:59:00.842849 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"51451449172b85e48e6f7160d6c3a89984fb59053a454a5a542d81cd6c8f7cfc"} Feb 23 06:59:00 crc kubenswrapper[4626]: I0223 06:59:00.842858 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"10171f1f2e181b30adc376c6b09e64c19d8a840b668ec53a48d43bc1532e660b"} Feb 23 06:59:01 crc kubenswrapper[4626]: I0223 06:59:01.858576 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"15a848b0e992d690411a6f19e0c813ee193b849e594b3529c87bae43d1be1d8b"} Feb 23 06:59:01 crc kubenswrapper[4626]: I0223 06:59:01.858917 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"16ff1ecde756f436c69b3a24ba29e3d7ace9f62b2d054613f081f8c111784407"} Feb 23 06:59:01 crc kubenswrapper[4626]: I0223 06:59:01.858934 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5d736eeb-711a-4553-96ff-2b0d9741ac28","Type":"ContainerStarted","Data":"35820c476bdd401affa7006880070c5cf86726504e257477088372a3142d5113"} Feb 23 06:59:01 crc kubenswrapper[4626]: I0223 06:59:01.893066 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.259828901 podStartE2EDuration="46.89305018s" podCreationTimestamp="2026-02-23 06:58:15 +0000 UTC" firstStartedPulling="2026-02-23 06:58:49.501899868 +0000 UTC m=+1081.841229134" lastFinishedPulling="2026-02-23 06:59:00.135121147 +0000 UTC m=+1092.474450413" observedRunningTime="2026-02-23 06:59:01.891400219 +0000 UTC m=+1094.230729485" watchObservedRunningTime="2026-02-23 06:59:01.89305018 +0000 UTC m=+1094.232379446" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.145908 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65cf6c888f-pj9gf"] Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146640 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerName="dnsmasq-dns" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146665 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerName="dnsmasq-dns" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146681 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerName="init" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146688 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerName="init" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146697 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b08b605-e241-47db-98f5-3b6051c589ee" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146713 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b08b605-e241-47db-98f5-3b6051c589ee" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146724 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a600db-8797-4c3e-98ee-d98d7daf59f9" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146730 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a600db-8797-4c3e-98ee-d98d7daf59f9" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146740 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e24e17-251e-4136-96cd-30d6a0fc3ee4" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146746 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e24e17-251e-4136-96cd-30d6a0fc3ee4" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146759 4626 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4c4002ff-ea4d-4c5d-a4da-793513d51e83" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146766 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4002ff-ea4d-4c5d-a4da-793513d51e83" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146782 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48794fdc-dcd4-4e56-88ea-6628ef7b4b80" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146788 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="48794fdc-dcd4-4e56-88ea-6628ef7b4b80" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146802 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c64a62-5f00-430d-af04-03ec55d5029d" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146808 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c64a62-5f00-430d-af04-03ec55d5029d" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146819 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44bbc0b-c334-4a3e-8a51-10394d82a253" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146825 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44bbc0b-c334-4a3e-8a51-10394d82a253" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: E0223 06:59:02.146834 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb7035a-577f-4291-b0dd-e4ee6e011018" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.146840 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb7035a-577f-4291-b0dd-e4ee6e011018" containerName="mariadb-account-create-update" 
Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147022 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e24e17-251e-4136-96cd-30d6a0fc3ee4" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147034 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="48794fdc-dcd4-4e56-88ea-6628ef7b4b80" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147044 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44bbc0b-c334-4a3e-8a51-10394d82a253" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147059 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d23afb-92d9-44e1-896e-c048ca8fe3d7" containerName="dnsmasq-dns" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147066 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b08b605-e241-47db-98f5-3b6051c589ee" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147076 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb7035a-577f-4291-b0dd-e4ee6e011018" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147082 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c64a62-5f00-430d-af04-03ec55d5029d" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147091 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4002ff-ea4d-4c5d-a4da-793513d51e83" containerName="mariadb-account-create-update" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147098 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a600db-8797-4c3e-98ee-d98d7daf59f9" containerName="mariadb-database-create" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.147991 4626 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.153861 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.172104 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cf6c888f-pj9gf"] Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.175599 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5hrgs" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235070 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-config-data\") pod \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235169 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-combined-ca-bundle\") pod \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235212 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5nwq\" (UniqueName: \"kubernetes.io/projected/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-kube-api-access-q5nwq\") pod \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\" (UID: \"f759b0fe-50ac-4eb6-8539-c34dcf9cf501\") " Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235515 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-svc\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: 
\"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235556 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-swift-storage-0\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235658 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-sb\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235705 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-config\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235730 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-nb\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.235748 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb872\" (UniqueName: 
\"kubernetes.io/projected/ec1abdc0-a49f-4669-9444-f4eea88e1066-kube-api-access-hb872\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.255457 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-kube-api-access-q5nwq" (OuterVolumeSpecName: "kube-api-access-q5nwq") pod "f759b0fe-50ac-4eb6-8539-c34dcf9cf501" (UID: "f759b0fe-50ac-4eb6-8539-c34dcf9cf501"). InnerVolumeSpecName "kube-api-access-q5nwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.263013 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f759b0fe-50ac-4eb6-8539-c34dcf9cf501" (UID: "f759b0fe-50ac-4eb6-8539-c34dcf9cf501"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.282564 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-config-data" (OuterVolumeSpecName: "config-data") pod "f759b0fe-50ac-4eb6-8539-c34dcf9cf501" (UID: "f759b0fe-50ac-4eb6-8539-c34dcf9cf501"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.336863 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-sb\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.336921 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-config\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.336950 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-nb\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.336967 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb872\" (UniqueName: \"kubernetes.io/projected/ec1abdc0-a49f-4669-9444-f4eea88e1066-kube-api-access-hb872\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.337010 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-svc\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 
crc kubenswrapper[4626]: I0223 06:59:02.337068 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-swift-storage-0\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.337117 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.337127 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.337138 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5nwq\" (UniqueName: \"kubernetes.io/projected/f759b0fe-50ac-4eb6-8539-c34dcf9cf501-kube-api-access-q5nwq\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.339180 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-svc\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.339298 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-swift-storage-0\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: 
I0223 06:59:02.339357 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-config\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.339475 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-sb\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.339478 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-nb\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.350429 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb872\" (UniqueName: \"kubernetes.io/projected/ec1abdc0-a49f-4669-9444-f4eea88e1066-kube-api-access-hb872\") pod \"dnsmasq-dns-65cf6c888f-pj9gf\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.493527 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.870242 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5hrgs" event={"ID":"f759b0fe-50ac-4eb6-8539-c34dcf9cf501","Type":"ContainerDied","Data":"89177a5eef2b9f4cdf3b97058c3078954733d3e4a37ba1db577770335ae670de"} Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.870607 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89177a5eef2b9f4cdf3b97058c3078954733d3e4a37ba1db577770335ae670de" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.870277 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5hrgs" Feb 23 06:59:02 crc kubenswrapper[4626]: I0223 06:59:02.924767 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cf6c888f-pj9gf"] Feb 23 06:59:02 crc kubenswrapper[4626]: W0223 06:59:02.928961 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1abdc0_a49f_4669_9444_f4eea88e1066.slice/crio-4b3069958c5d1bda441f59a1f88d1a0abb1219cbfdef4d31e38ae244ed3306d0 WatchSource:0}: Error finding container 4b3069958c5d1bda441f59a1f88d1a0abb1219cbfdef4d31e38ae244ed3306d0: Status 404 returned error can't find the container with id 4b3069958c5d1bda441f59a1f88d1a0abb1219cbfdef4d31e38ae244ed3306d0 Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.089984 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-njplj"] Feb 23 06:59:03 crc kubenswrapper[4626]: E0223 06:59:03.090309 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f759b0fe-50ac-4eb6-8539-c34dcf9cf501" containerName="keystone-db-sync" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.090321 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f759b0fe-50ac-4eb6-8539-c34dcf9cf501" containerName="keystone-db-sync" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.090483 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f759b0fe-50ac-4eb6-8539-c34dcf9cf501" containerName="keystone-db-sync" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.090936 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.102945 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-25kl2" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.110170 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.114639 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.114892 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.115005 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.123518 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cf6c888f-pj9gf"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.129979 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-njplj"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.152695 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-scripts\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 
crc kubenswrapper[4626]: I0223 06:59:03.152744 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbq5\" (UniqueName: \"kubernetes.io/projected/58a8c613-a4a0-4760-ad05-1cd267388fc2-kube-api-access-xrbq5\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.152791 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-config-data\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.152835 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-credential-keys\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.152866 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-fernet-keys\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.152915 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-combined-ca-bundle\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 
crc kubenswrapper[4626]: I0223 06:59:03.174561 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c578c478c-7przb"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.176295 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.179654 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c578c478c-7przb"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323011 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-config\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323072 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-credential-keys\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323123 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-fernet-keys\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323200 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-combined-ca-bundle\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " 
pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323226 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9cs\" (UniqueName: \"kubernetes.io/projected/28cb7a2b-38cf-4b21-9d68-a550daf001f0-kube-api-access-zt9cs\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323273 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-svc\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323293 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323316 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323384 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-scripts\") pod \"keystone-bootstrap-njplj\" (UID: 
\"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323409 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbq5\" (UniqueName: \"kubernetes.io/projected/58a8c613-a4a0-4760-ad05-1cd267388fc2-kube-api-access-xrbq5\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323470 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.323521 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-config-data\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.336539 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-config-data\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.343152 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-credential-keys\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 
23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.345017 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-scripts\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.347956 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-combined-ca-bundle\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.354932 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-fernet-keys\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.375337 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-fttdm"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.377046 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fttdm" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.380196 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-x72r5" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.380418 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.390464 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbq5\" (UniqueName: \"kubernetes.io/projected/58a8c613-a4a0-4760-ad05-1cd267388fc2-kube-api-access-xrbq5\") pod \"keystone-bootstrap-njplj\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.393545 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fttdm"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.416027 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b5d74467f-dcfkd"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.417412 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b5d74467f-dcfkd" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.426847 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427045 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427158 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427260 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jdcqr" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427620 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9cs\" (UniqueName: \"kubernetes.io/projected/28cb7a2b-38cf-4b21-9d68-a550daf001f0-kube-api-access-zt9cs\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427690 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-svc\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427719 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427736 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427839 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.427914 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-config\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.428742 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-config\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.429424 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-svc\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.429907 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.430433 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.430794 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.431298 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.435559 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b5d74467f-dcfkd"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.454209 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9cs\" (UniqueName: \"kubernetes.io/projected/28cb7a2b-38cf-4b21-9d68-a550daf001f0-kube-api-access-zt9cs\") pod \"dnsmasq-dns-5c578c478c-7przb\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.497052 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.518051 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7stdk"] Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.527144 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7stdk" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.528802 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-config-data\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.528892 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-544ls\" (UniqueName: \"kubernetes.io/projected/b40e0f9f-9c08-4344-ad79-47d1663967b9-kube-api-access-544ls\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.528941 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-config-data\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm" Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.528978 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-scripts\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd" Feb 23 06:59:03 crc 
kubenswrapper[4626]: I0223 06:59:03.529002 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b40e0f9f-9c08-4344-ad79-47d1663967b9-horizon-secret-key\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.529024 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40e0f9f-9c08-4344-ad79-47d1663967b9-logs\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.529069 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-combined-ca-bundle\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.529090 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqdg\" (UniqueName: \"kubernetes.io/projected/c1806f1a-08dd-4b17-a799-1122348a4ab3-kube-api-access-dnqdg\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.534859 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dglfj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.535070 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.542631 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-47npm"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.546525 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.547369 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.555345 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.555424 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.555591 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x6xqg"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.572742 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jp5n9"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.573759 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.592540 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7stdk"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.599023 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.599172 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-828v2"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.605799 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.606249 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jp5n9"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642419 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-config-data\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642691 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-combined-ca-bundle\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642725 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-db-sync-config-data\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642756 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-etc-machine-id\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642787 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-scripts\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642854 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-544ls\" (UniqueName: \"kubernetes.io/projected/b40e0f9f-9c08-4344-ad79-47d1663967b9-kube-api-access-544ls\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642886 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-config-data\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642932 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-config-data\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642969 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nlxl\" (UniqueName: \"kubernetes.io/projected/a203dc9f-43a6-4cf4-ac68-7c5125053cba-kube-api-access-4nlxl\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.642993 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-scripts\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.643014 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4nmz\" (UniqueName: \"kubernetes.io/projected/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-kube-api-access-k4nmz\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.643031 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-scripts\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.643047 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b40e0f9f-9c08-4344-ad79-47d1663967b9-horizon-secret-key\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.644808 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-config-data\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.644882 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-47npm"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.659421 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-scripts\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.662032 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-combined-ca-bundle\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.662076 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40e0f9f-9c08-4344-ad79-47d1663967b9-logs\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.662113 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a203dc9f-43a6-4cf4-ac68-7c5125053cba-logs\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.662158 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-config-data\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.662187 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-combined-ca-bundle\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.662214 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqdg\" (UniqueName: \"kubernetes.io/projected/c1806f1a-08dd-4b17-a799-1122348a4ab3-kube-api-access-dnqdg\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.662530 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40e0f9f-9c08-4344-ad79-47d1663967b9-logs\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.665220 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b40e0f9f-9c08-4344-ad79-47d1663967b9-horizon-secret-key\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.666407 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-config-data\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.670236 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-combined-ca-bundle\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.683276 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-544ls\" (UniqueName: \"kubernetes.io/projected/b40e0f9f-9c08-4344-ad79-47d1663967b9-kube-api-access-544ls\") pod \"horizon-7b5d74467f-dcfkd\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.703942 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqdg\" (UniqueName: \"kubernetes.io/projected/c1806f1a-08dd-4b17-a799-1122348a4ab3-kube-api-access-dnqdg\") pod \"heat-db-sync-fttdm\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.731104 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67fd4f7c55-mmtls"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.732588 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763775 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-etc-machine-id\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763816 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-scripts\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763848 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-combined-ca-bundle\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763868 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mtw5\" (UniqueName: \"kubernetes.io/projected/81f3630a-a5f4-4a54-91e3-e6764673beca-kube-api-access-5mtw5\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763903 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-config-data\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763922 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-horizon-secret-key\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763973 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nlxl\" (UniqueName: \"kubernetes.io/projected/a203dc9f-43a6-4cf4-ac68-7c5125053cba-kube-api-access-4nlxl\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.763992 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kmpv\" (UniqueName: \"kubernetes.io/projected/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-kube-api-access-8kmpv\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764007 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-scripts\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764026 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4nmz\" (UniqueName: \"kubernetes.io/projected/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-kube-api-access-k4nmz\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764044 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-config\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764060 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-combined-ca-bundle\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764084 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-config-data\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764099 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a203dc9f-43a6-4cf4-ac68-7c5125053cba-logs\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764125 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-config-data\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764154 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-logs\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764174 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-scripts\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764190 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-combined-ca-bundle\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.764206 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-db-sync-config-data\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.767633 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-etc-machine-id\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.768202 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c578c478c-7przb"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.768807 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-scripts\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.769340 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a203dc9f-43a6-4cf4-ac68-7c5125053cba-logs\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.775985 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-combined-ca-bundle\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.782466 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-config-data\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.783124 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-scripts\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.784065 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-config-data\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.783526 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-db-sync-config-data\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.792449 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-combined-ca-bundle\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.793852 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fttdm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.796411 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67fd4f7c55-mmtls"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.811256 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.814019 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-nxxgl"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.816279 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nxxgl"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.825904 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cjbtp"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.826172 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.828663 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4nmz\" (UniqueName: \"kubernetes.io/projected/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-kube-api-access-k4nmz\") pod \"cinder-db-sync-47npm\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") " pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.853795 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.857211 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nlxl\" (UniqueName: \"kubernetes.io/projected/a203dc9f-43a6-4cf4-ac68-7c5125053cba-kube-api-access-4nlxl\") pod \"placement-db-sync-7stdk\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " pod="openstack/placement-db-sync-7stdk"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.858896 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.867625 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-combined-ca-bundle\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.867692 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mtw5\" (UniqueName: \"kubernetes.io/projected/81f3630a-a5f4-4a54-91e3-e6764673beca-kube-api-access-5mtw5\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.867862 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-horizon-secret-key\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.867956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmpv\" (UniqueName: \"kubernetes.io/projected/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-kube-api-access-8kmpv\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.868006 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-config\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.868052 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-config-data\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.868163 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-logs\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.868191 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-scripts\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.869564 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-scripts\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.873117 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-logs\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.874897 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-config-data\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.875953 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nxxgl"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.899741 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-config\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.909055 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-47npm"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.909946 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmpv\" (UniqueName: \"kubernetes.io/projected/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-kube-api-access-8kmpv\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.912054 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-horizon-secret-key\") pod \"horizon-67fd4f7c55-mmtls\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " pod="openstack/horizon-67fd4f7c55-mmtls"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.916903 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-combined-ca-bundle\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.919466 4626 generic.go:334] "Generic (PLEG): container finished" podID="ec1abdc0-a49f-4669-9444-f4eea88e1066" containerID="043cd0e1c2cb4440e22d02a0087f2fb58fd9490a4dcca04740edbdbf5e59d7f1" exitCode=0
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.919527 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" event={"ID":"ec1abdc0-a49f-4669-9444-f4eea88e1066","Type":"ContainerDied","Data":"043cd0e1c2cb4440e22d02a0087f2fb58fd9490a4dcca04740edbdbf5e59d7f1"}
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.919557 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" event={"ID":"ec1abdc0-a49f-4669-9444-f4eea88e1066","Type":"ContainerStarted","Data":"4b3069958c5d1bda441f59a1f88d1a0abb1219cbfdef4d31e38ae244ed3306d0"}
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.944344 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mtw5\" (UniqueName: \"kubernetes.io/projected/81f3630a-a5f4-4a54-91e3-e6764673beca-kube-api-access-5mtw5\") pod \"neutron-db-sync-jp5n9\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " pod="openstack/neutron-db-sync-jp5n9"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.971598 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.971966 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-db-sync-config-data\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.972077 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-sb\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.972106 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-nb\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.972168 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-config\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.972254 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwxkz\" (UniqueName: \"kubernetes.io/projected/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-kube-api-access-cwxkz\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.973140 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhfbw\" (UniqueName: \"kubernetes.io/projected/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-kube-api-access-jhfbw\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.973245 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-swift-storage-0\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.973445 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-combined-ca-bundle\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.973715 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-svc\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.977974 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 06:59:03 crc kubenswrapper[4626]: I0223 06:59:03.991044 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:03.993896 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:03.995161 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:03.995295 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zv7dz" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:03.995433 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.062639 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.062676 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079187 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwxkz\" (UniqueName: \"kubernetes.io/projected/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-kube-api-access-cwxkz\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079258 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhfbw\" (UniqueName: \"kubernetes.io/projected/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-kube-api-access-jhfbw\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079281 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-swift-storage-0\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079330 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-combined-ca-bundle\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079380 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-svc\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079402 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-db-sync-config-data\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079434 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-sb\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079459 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-nb\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.079492 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-config\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.080361 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-config\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.085229 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-svc\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.100096 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.100443 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.114913 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.116004 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.122638 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-swift-storage-0\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.123295 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-combined-ca-bundle\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.124081 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-sb\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.125889 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-nb\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.131856 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-db-sync-config-data\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.135843 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67fd4f7c55-mmtls" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.154418 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwxkz\" (UniqueName: \"kubernetes.io/projected/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-kube-api-access-cwxkz\") pod \"dnsmasq-dns-b7cb4c7b9-nj7qj\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.160065 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhfbw\" (UniqueName: \"kubernetes.io/projected/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-kube-api-access-jhfbw\") pod \"barbican-db-sync-nxxgl\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.198691 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7stdk" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.199370 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.201418 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.204263 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvp9n\" (UniqueName: \"kubernetes.io/projected/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-kube-api-access-hvp9n\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.204392 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-logs\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.205710 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-run-httpd\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.205830 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-scripts\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.205893 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 
06:59:04.205979 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-config-data\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.206053 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.206211 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.206276 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-config-data\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.206383 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-log-httpd\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.206464 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.206575 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxvrx\" (UniqueName: \"kubernetes.io/projected/f51ed1e0-07d1-462c-91e1-cc35aaead170-kube-api-access-wxvrx\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.206649 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.225011 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.225432 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-scripts\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.231187 4626 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jp5n9" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327540 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327612 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-config-data\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327665 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-log-httpd\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327711 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327735 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxvrx\" (UniqueName: \"kubernetes.io/projected/f51ed1e0-07d1-462c-91e1-cc35aaead170-kube-api-access-wxvrx\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc 
kubenswrapper[4626]: I0223 06:59:04.327754 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327820 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327897 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-scripts\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327927 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvp9n\" (UniqueName: \"kubernetes.io/projected/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-kube-api-access-hvp9n\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327964 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-logs\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.327984 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-run-httpd\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.328042 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-scripts\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.328055 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.328072 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-config-data\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.328091 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.328621 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") device mount path 
\"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.351110 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-logs\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.355451 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-run-httpd\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.361039 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-log-httpd\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.367565 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.379981 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-scripts\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.391008 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-config-data\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.391603 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.395161 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-scripts\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.405085 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.406855 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.415076 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvp9n\" (UniqueName: \"kubernetes.io/projected/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-kube-api-access-hvp9n\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " 
pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.416973 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.417086 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxvrx\" (UniqueName: \"kubernetes.io/projected/f51ed1e0-07d1-462c-91e1-cc35aaead170-kube-api-access-wxvrx\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.417790 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.451184 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-config-data\") pod \"glance-default-external-api-0\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.485625 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-njplj"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.493107 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.504832 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.506567 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.515916 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.518060 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.518237 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.578131 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c578c478c-7przb"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.586814 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fttdm"] Feb 23 06:59:04 crc kubenswrapper[4626]: W0223 06:59:04.611815 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28cb7a2b_38cf_4b21_9d68_a550daf001f0.slice/crio-923e63cf7aed6f2ec3269db9c6d5171efaf8c3b0885db07cb4d59768277c2ab2 WatchSource:0}: Error finding container 923e63cf7aed6f2ec3269db9c6d5171efaf8c3b0885db07cb4d59768277c2ab2: Status 404 returned error can't find the container with id 923e63cf7aed6f2ec3269db9c6d5171efaf8c3b0885db07cb4d59768277c2ab2 Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.665797 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.669680 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.669777 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.669887 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.669923 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.669956 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.669997 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.670056 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jns4\" (UniqueName: \"kubernetes.io/projected/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-kube-api-access-2jns4\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.670101 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.744091 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.772418 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.772470 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.772521 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.772568 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.772631 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jns4\" (UniqueName: \"kubernetes.io/projected/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-kube-api-access-2jns4\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc 
kubenswrapper[4626]: I0223 06:59:04.772666 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.772753 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.772822 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.785031 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.785259 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.786271 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.786511 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.803350 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.811906 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.813383 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jns4\" (UniqueName: \"kubernetes.io/projected/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-kube-api-access-2jns4\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.819060 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.867280 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b5d74467f-dcfkd"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.876943 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-svc\") pod \"ec1abdc0-a49f-4669-9444-f4eea88e1066\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.877067 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-config\") pod \"ec1abdc0-a49f-4669-9444-f4eea88e1066\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.877121 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-swift-storage-0\") pod \"ec1abdc0-a49f-4669-9444-f4eea88e1066\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.877163 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-sb\") pod \"ec1abdc0-a49f-4669-9444-f4eea88e1066\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.877313 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-nb\") pod \"ec1abdc0-a49f-4669-9444-f4eea88e1066\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.877349 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb872\" (UniqueName: \"kubernetes.io/projected/ec1abdc0-a49f-4669-9444-f4eea88e1066-kube-api-access-hb872\") pod \"ec1abdc0-a49f-4669-9444-f4eea88e1066\" (UID: \"ec1abdc0-a49f-4669-9444-f4eea88e1066\") " Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.880938 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1abdc0-a49f-4669-9444-f4eea88e1066-kube-api-access-hb872" (OuterVolumeSpecName: "kube-api-access-hb872") pod "ec1abdc0-a49f-4669-9444-f4eea88e1066" (UID: "ec1abdc0-a49f-4669-9444-f4eea88e1066"). InnerVolumeSpecName "kube-api-access-hb872". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.883853 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67fd4f7c55-mmtls"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.888560 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.895065 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec1abdc0-a49f-4669-9444-f4eea88e1066" (UID: "ec1abdc0-a49f-4669-9444-f4eea88e1066"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.898594 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec1abdc0-a49f-4669-9444-f4eea88e1066" (UID: "ec1abdc0-a49f-4669-9444-f4eea88e1066"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.932798 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ec1abdc0-a49f-4669-9444-f4eea88e1066" (UID: "ec1abdc0-a49f-4669-9444-f4eea88e1066"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.935057 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec1abdc0-a49f-4669-9444-f4eea88e1066" (UID: "ec1abdc0-a49f-4669-9444-f4eea88e1066"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.952223 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-47npm"] Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.962732 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-config" (OuterVolumeSpecName: "config") pod "ec1abdc0-a49f-4669-9444-f4eea88e1066" (UID: "ec1abdc0-a49f-4669-9444-f4eea88e1066"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.968169 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-njplj" event={"ID":"58a8c613-a4a0-4760-ad05-1cd267388fc2","Type":"ContainerStarted","Data":"6209b813b4d475ba8bf98581450d37fa18b16e29c0ca127ea5be5ce2a3cebccc"} Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.968207 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-njplj" event={"ID":"58a8c613-a4a0-4760-ad05-1cd267388fc2","Type":"ContainerStarted","Data":"61ea565b4dda760c01394793039a12f92b55d5e640dc2206022a27453bfb6ba3"} Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.978819 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c578c478c-7przb" event={"ID":"28cb7a2b-38cf-4b21-9d68-a550daf001f0","Type":"ContainerStarted","Data":"923e63cf7aed6f2ec3269db9c6d5171efaf8c3b0885db07cb4d59768277c2ab2"} Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.986920 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.986945 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb872\" (UniqueName: \"kubernetes.io/projected/ec1abdc0-a49f-4669-9444-f4eea88e1066-kube-api-access-hb872\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.986958 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.986970 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.986981 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.986991 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1abdc0-a49f-4669-9444-f4eea88e1066-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.987461 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-njplj" podStartSLOduration=1.9874440629999999 podStartE2EDuration="1.987444063s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:04.983749417 +0000 UTC m=+1097.323078683" watchObservedRunningTime="2026-02-23 06:59:04.987444063 +0000 UTC m=+1097.326773329" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.992868 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" event={"ID":"ec1abdc0-a49f-4669-9444-f4eea88e1066","Type":"ContainerDied","Data":"4b3069958c5d1bda441f59a1f88d1a0abb1219cbfdef4d31e38ae244ed3306d0"} Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.993014 4626 scope.go:117] "RemoveContainer" containerID="043cd0e1c2cb4440e22d02a0087f2fb58fd9490a4dcca04740edbdbf5e59d7f1" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.993187 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cf6c888f-pj9gf" Feb 23 06:59:04 crc kubenswrapper[4626]: I0223 06:59:04.995441 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fttdm" event={"ID":"c1806f1a-08dd-4b17-a799-1122348a4ab3","Type":"ContainerStarted","Data":"6b802228811fe0e6d94adfd5a621794d41cd33fea5320d97866d8337dd8e20d3"} Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.080928 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cf6c888f-pj9gf"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.095735 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65cf6c888f-pj9gf"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.139339 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.249583 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nxxgl"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.275438 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7stdk"] Feb 23 06:59:05 crc kubenswrapper[4626]: W0223 06:59:05.284247 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod498f5c1c_5f75_49e1_909a_e7ce904ebd9d.slice/crio-72b6a51bf784104dcdb1e1b09493666adca85d09636aec99ee126245582619d8 WatchSource:0}: Error finding container 72b6a51bf784104dcdb1e1b09493666adca85d09636aec99ee126245582619d8: Status 404 returned error can't find the container with id 72b6a51bf784104dcdb1e1b09493666adca85d09636aec99ee126245582619d8 Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.393883 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.417247 4626 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-db-sync-jp5n9"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.451751 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.586313 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.910742 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:05 crc kubenswrapper[4626]: I0223 06:59:05.999929 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1abdc0-a49f-4669-9444-f4eea88e1066" path="/var/lib/kubelet/pods/ec1abdc0-a49f-4669-9444-f4eea88e1066/volumes" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.027448 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f51ed1e0-07d1-462c-91e1-cc35aaead170","Type":"ContainerStarted","Data":"40c23985378c71cea05370f84aa5ba86d8ff71a08ad15df0247fb4d1c777dc61"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.031658 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5d74467f-dcfkd" event={"ID":"b40e0f9f-9c08-4344-ad79-47d1663967b9","Type":"ContainerStarted","Data":"4119ebbde0e31b880dece4174dc50a534a73f7eb80d56fa969861733bcefd739"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.035543 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35","Type":"ContainerStarted","Data":"8e0c2b77838f707f9dd1000102c2d5bd2b63b4d219f258702d48c06e48f86f50"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.037972 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nxxgl" 
event={"ID":"498f5c1c-5f75-49e1-909a-e7ce904ebd9d","Type":"ContainerStarted","Data":"72b6a51bf784104dcdb1e1b09493666adca85d09636aec99ee126245582619d8"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.047560 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd4f7c55-mmtls" event={"ID":"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f","Type":"ContainerStarted","Data":"86bb47b80555c5fe7492c3c794b4f979ebe539708cc9035943e911ae964211df"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.106793 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7stdk" event={"ID":"a203dc9f-43a6-4cf4-ac68-7c5125053cba","Type":"ContainerStarted","Data":"4fa4c2f5791ed6dfeafcca2e792fcf7571d9c50c9c1c9f2a571b97ef11a12136"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.168069 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.191224 4626 generic.go:334] "Generic (PLEG): container finished" podID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerID="8579c9d227d2efe45662504b8ce7546b5d19b652804abcc8283456760af039b2" exitCode=0 Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.191433 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" event={"ID":"6e24c5dc-1890-47c6-84c9-dfa3c3965c81","Type":"ContainerDied","Data":"8579c9d227d2efe45662504b8ce7546b5d19b652804abcc8283456760af039b2"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.191955 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" event={"ID":"6e24c5dc-1890-47c6-84c9-dfa3c3965c81","Type":"ContainerStarted","Data":"18499cfc10f61394d878ca8729bbdd2bcffef2bf6e9cc58520125d11a71965d7"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.219255 4626 generic.go:334] "Generic (PLEG): container finished" podID="28cb7a2b-38cf-4b21-9d68-a550daf001f0" 
containerID="5714f97292cf6985c4c1ccb9856a454e9a75145a238bb32a77743d1913b8df3d" exitCode=0 Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.219343 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c578c478c-7przb" event={"ID":"28cb7a2b-38cf-4b21-9d68-a550daf001f0","Type":"ContainerDied","Data":"5714f97292cf6985c4c1ccb9856a454e9a75145a238bb32a77743d1913b8df3d"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.344827 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.373834 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-47npm" event={"ID":"c5cc94ca-558e-4a2c-8d28-5aedbecb3090","Type":"ContainerStarted","Data":"3d0758074e2db9265400432017ff66539deb298128a7d170ba7cb48d029c0282"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.428309 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67fd4f7c55-mmtls"] Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.518150 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59898cd8f5-xpkhf"] Feb 23 06:59:06 crc kubenswrapper[4626]: E0223 06:59:06.518743 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1abdc0-a49f-4669-9444-f4eea88e1066" containerName="init" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.518761 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1abdc0-a49f-4669-9444-f4eea88e1066" containerName="init" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.518953 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1abdc0-a49f-4669-9444-f4eea88e1066" containerName="init" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.522064 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.568536 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b2634f7-a041-40b7-beb0-36e366627314-horizon-secret-key\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.568696 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-scripts\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.568810 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rch85\" (UniqueName: \"kubernetes.io/projected/7b2634f7-a041-40b7-beb0-36e366627314-kube-api-access-rch85\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.569146 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-config-data\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.569745 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b2634f7-a041-40b7-beb0-36e366627314-logs\") pod \"horizon-59898cd8f5-xpkhf\" (UID: 
\"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.590278 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jp5n9" event={"ID":"81f3630a-a5f4-4a54-91e3-e6764673beca","Type":"ContainerStarted","Data":"165b5f3d9af6d08ff22d0934fda72534b4c88204b2d76378f1734acceafb7f84"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.590336 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jp5n9" event={"ID":"81f3630a-a5f4-4a54-91e3-e6764673beca","Type":"ContainerStarted","Data":"04e3561c207cfcec0429ead72ce136d88087beee3325f85e3a350e2f831d0147"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.595814 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59898cd8f5-xpkhf"] Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.600811 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fdce4eb3-6315-4bdb-af51-aaf680fe57c8","Type":"ContainerStarted","Data":"3893c533c3b5fd2ea37b63e2a9c14a3a04a2fcb6176a73252781519c7227b097"} Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.638284 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.675610 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jp5n9" podStartSLOduration=3.6755927269999997 podStartE2EDuration="3.675592727s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:06.60974349 +0000 UTC m=+1098.949072756" watchObservedRunningTime="2026-02-23 06:59:06.675592727 +0000 UTC m=+1099.014921992" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.688042 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b2634f7-a041-40b7-beb0-36e366627314-horizon-secret-key\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.688134 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-scripts\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.688160 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rch85\" (UniqueName: \"kubernetes.io/projected/7b2634f7-a041-40b7-beb0-36e366627314-kube-api-access-rch85\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.688289 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-config-data\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.688596 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b2634f7-a041-40b7-beb0-36e366627314-logs\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.689165 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7b2634f7-a041-40b7-beb0-36e366627314-logs\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.691290 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-config-data\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.695261 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-scripts\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.713429 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b2634f7-a041-40b7-beb0-36e366627314-horizon-secret-key\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.734420 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rch85\" (UniqueName: \"kubernetes.io/projected/7b2634f7-a041-40b7-beb0-36e366627314-kube-api-access-rch85\") pod \"horizon-59898cd8f5-xpkhf\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.849551 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.893002 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.896618 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-swift-storage-0\") pod \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.896722 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt9cs\" (UniqueName: \"kubernetes.io/projected/28cb7a2b-38cf-4b21-9d68-a550daf001f0-kube-api-access-zt9cs\") pod \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.897158 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-config\") pod \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.897189 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-svc\") pod \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.897214 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-nb\") pod \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.897250 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-sb\") pod \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\" (UID: \"28cb7a2b-38cf-4b21-9d68-a550daf001f0\") " Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.924785 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cb7a2b-38cf-4b21-9d68-a550daf001f0-kube-api-access-zt9cs" (OuterVolumeSpecName: "kube-api-access-zt9cs") pod "28cb7a2b-38cf-4b21-9d68-a550daf001f0" (UID: "28cb7a2b-38cf-4b21-9d68-a550daf001f0"). InnerVolumeSpecName "kube-api-access-zt9cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.938234 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "28cb7a2b-38cf-4b21-9d68-a550daf001f0" (UID: "28cb7a2b-38cf-4b21-9d68-a550daf001f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.947295 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "28cb7a2b-38cf-4b21-9d68-a550daf001f0" (UID: "28cb7a2b-38cf-4b21-9d68-a550daf001f0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.947755 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28cb7a2b-38cf-4b21-9d68-a550daf001f0" (UID: "28cb7a2b-38cf-4b21-9d68-a550daf001f0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.951882 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-config" (OuterVolumeSpecName: "config") pod "28cb7a2b-38cf-4b21-9d68-a550daf001f0" (UID: "28cb7a2b-38cf-4b21-9d68-a550daf001f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.984797 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "28cb7a2b-38cf-4b21-9d68-a550daf001f0" (UID: "28cb7a2b-38cf-4b21-9d68-a550daf001f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.999233 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.999256 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.999266 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.999276 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:06 crc kubenswrapper[4626]: 
I0223 06:59:06.999285 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28cb7a2b-38cf-4b21-9d68-a550daf001f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:06 crc kubenswrapper[4626]: I0223 06:59:06.999296 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt9cs\" (UniqueName: \"kubernetes.io/projected/28cb7a2b-38cf-4b21-9d68-a550daf001f0-kube-api-access-zt9cs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:07 crc kubenswrapper[4626]: I0223 06:59:07.640250 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" event={"ID":"6e24c5dc-1890-47c6-84c9-dfa3c3965c81","Type":"ContainerStarted","Data":"0edeaf807430d6d17d32e8a9f893fbccafc909cc904c36a4d892d919a528b6ff"} Feb 23 06:59:07 crc kubenswrapper[4626]: I0223 06:59:07.653206 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c578c478c-7przb" event={"ID":"28cb7a2b-38cf-4b21-9d68-a550daf001f0","Type":"ContainerDied","Data":"923e63cf7aed6f2ec3269db9c6d5171efaf8c3b0885db07cb4d59768277c2ab2"} Feb 23 06:59:07 crc kubenswrapper[4626]: I0223 06:59:07.653264 4626 scope.go:117] "RemoveContainer" containerID="5714f97292cf6985c4c1ccb9856a454e9a75145a238bb32a77743d1913b8df3d" Feb 23 06:59:07 crc kubenswrapper[4626]: I0223 06:59:07.653453 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c578c478c-7przb" Feb 23 06:59:07 crc kubenswrapper[4626]: I0223 06:59:07.740435 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59898cd8f5-xpkhf"] Feb 23 06:59:07 crc kubenswrapper[4626]: I0223 06:59:07.893775 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c578c478c-7przb"] Feb 23 06:59:07 crc kubenswrapper[4626]: I0223 06:59:07.929742 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c578c478c-7przb"] Feb 23 06:59:08 crc kubenswrapper[4626]: I0223 06:59:08.051355 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cb7a2b-38cf-4b21-9d68-a550daf001f0" path="/var/lib/kubelet/pods/28cb7a2b-38cf-4b21-9d68-a550daf001f0/volumes" Feb 23 06:59:08 crc kubenswrapper[4626]: I0223 06:59:08.728481 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fdce4eb3-6315-4bdb-af51-aaf680fe57c8","Type":"ContainerStarted","Data":"9e8367043f12a1edc5158535150a9b16737af30449ce06d2a8a72aeea3021423"} Feb 23 06:59:08 crc kubenswrapper[4626]: I0223 06:59:08.736838 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f51ed1e0-07d1-462c-91e1-cc35aaead170","Type":"ContainerStarted","Data":"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80"} Feb 23 06:59:08 crc kubenswrapper[4626]: I0223 06:59:08.758933 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59898cd8f5-xpkhf" event={"ID":"7b2634f7-a041-40b7-beb0-36e366627314","Type":"ContainerStarted","Data":"d64251e4d7548754bd5c464a70b04207b29a39b0586369517ddd02d5032f50ac"} Feb 23 06:59:08 crc kubenswrapper[4626]: I0223 06:59:08.759091 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:08 crc kubenswrapper[4626]: I0223 06:59:08.778436 4626 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" podStartSLOduration=5.778423512 podStartE2EDuration="5.778423512s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:08.773726066 +0000 UTC m=+1101.113055331" watchObservedRunningTime="2026-02-23 06:59:08.778423512 +0000 UTC m=+1101.117752769" Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.777451 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fdce4eb3-6315-4bdb-af51-aaf680fe57c8","Type":"ContainerStarted","Data":"b2258928ad77e8b9496b32bdcbd111edcde062af2a54a2de901aaed163b96066"} Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.778065 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-log" containerID="cri-o://9e8367043f12a1edc5158535150a9b16737af30449ce06d2a8a72aeea3021423" gracePeriod=30 Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.778660 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-httpd" containerID="cri-o://b2258928ad77e8b9496b32bdcbd111edcde062af2a54a2de901aaed163b96066" gracePeriod=30 Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.783234 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-log" containerID="cri-o://67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80" gracePeriod=30 Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.783284 4626 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f51ed1e0-07d1-462c-91e1-cc35aaead170","Type":"ContainerStarted","Data":"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40"} Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.783326 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-httpd" containerID="cri-o://0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40" gracePeriod=30 Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.835965 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.835949832 podStartE2EDuration="6.835949832s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:09.835082828 +0000 UTC m=+1102.174412094" watchObservedRunningTime="2026-02-23 06:59:09.835949832 +0000 UTC m=+1102.175279099" Feb 23 06:59:09 crc kubenswrapper[4626]: I0223 06:59:09.875292 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.875271723 podStartE2EDuration="6.875271723s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:09.81399165 +0000 UTC m=+1102.153320917" watchObservedRunningTime="2026-02-23 06:59:09.875271723 +0000 UTC m=+1102.214600989" Feb 23 06:59:10 crc kubenswrapper[4626]: E0223 06:59:10.378276 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdce4eb3_6315_4bdb_af51_aaf680fe57c8.slice/crio-conmon-b2258928ad77e8b9496b32bdcbd111edcde062af2a54a2de901aaed163b96066.scope\": RecentStats: unable to find data in memory cache]" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.678262 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760356 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-logs\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760429 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-httpd-run\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760470 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxvrx\" (UniqueName: \"kubernetes.io/projected/f51ed1e0-07d1-462c-91e1-cc35aaead170-kube-api-access-wxvrx\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760560 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-scripts\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760601 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760643 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-combined-ca-bundle\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760690 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-config-data\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.760806 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-public-tls-certs\") pod \"f51ed1e0-07d1-462c-91e1-cc35aaead170\" (UID: \"f51ed1e0-07d1-462c-91e1-cc35aaead170\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.763576 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.765192 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-logs" (OuterVolumeSpecName: "logs") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.769847 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.778796 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-scripts" (OuterVolumeSpecName: "scripts") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.787052 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f51ed1e0-07d1-462c-91e1-cc35aaead170-kube-api-access-wxvrx" (OuterVolumeSpecName: "kube-api-access-wxvrx") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "kube-api-access-wxvrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.810292 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.823226 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-config-data" (OuterVolumeSpecName: "config-data") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.835582 4626 generic.go:334] "Generic (PLEG): container finished" podID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerID="b2258928ad77e8b9496b32bdcbd111edcde062af2a54a2de901aaed163b96066" exitCode=0 Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.835619 4626 generic.go:334] "Generic (PLEG): container finished" podID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerID="9e8367043f12a1edc5158535150a9b16737af30449ce06d2a8a72aeea3021423" exitCode=143 Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.835699 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fdce4eb3-6315-4bdb-af51-aaf680fe57c8","Type":"ContainerDied","Data":"b2258928ad77e8b9496b32bdcbd111edcde062af2a54a2de901aaed163b96066"} Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.835740 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fdce4eb3-6315-4bdb-af51-aaf680fe57c8","Type":"ContainerDied","Data":"9e8367043f12a1edc5158535150a9b16737af30449ce06d2a8a72aeea3021423"} Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.837857 4626 generic.go:334] "Generic (PLEG): container finished" podID="58a8c613-a4a0-4760-ad05-1cd267388fc2" containerID="6209b813b4d475ba8bf98581450d37fa18b16e29c0ca127ea5be5ce2a3cebccc" exitCode=0 Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.837900 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-njplj" event={"ID":"58a8c613-a4a0-4760-ad05-1cd267388fc2","Type":"ContainerDied","Data":"6209b813b4d475ba8bf98581450d37fa18b16e29c0ca127ea5be5ce2a3cebccc"} Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.845393 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f51ed1e0-07d1-462c-91e1-cc35aaead170" (UID: "f51ed1e0-07d1-462c-91e1-cc35aaead170"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.854887 4626 generic.go:334] "Generic (PLEG): container finished" podID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerID="0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40" exitCode=0 Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.854924 4626 generic.go:334] "Generic (PLEG): container finished" podID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerID="67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80" exitCode=143 Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.854953 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f51ed1e0-07d1-462c-91e1-cc35aaead170","Type":"ContainerDied","Data":"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40"} Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.854958 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.854994 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f51ed1e0-07d1-462c-91e1-cc35aaead170","Type":"ContainerDied","Data":"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80"} Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.855005 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f51ed1e0-07d1-462c-91e1-cc35aaead170","Type":"ContainerDied","Data":"40c23985378c71cea05370f84aa5ba86d8ff71a08ad15df0247fb4d1c777dc61"} Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.855025 4626 scope.go:117] "RemoveContainer" containerID="0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.857960 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864196 4626 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864294 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-logs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864354 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f51ed1e0-07d1-462c-91e1-cc35aaead170-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864405 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxvrx\" 
(UniqueName: \"kubernetes.io/projected/f51ed1e0-07d1-462c-91e1-cc35aaead170-kube-api-access-wxvrx\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864454 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864539 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864596 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.864639 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f51ed1e0-07d1-462c-91e1-cc35aaead170-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.882457 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.918678 4626 scope.go:117] "RemoveContainer" containerID="67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.963048 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.966117 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-config-data\") pod 
\"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.966176 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-combined-ca-bundle\") pod \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.966280 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-scripts\") pod \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.966349 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-internal-tls-certs\") pod \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.966606 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-httpd-run\") pod \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.966743 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.966765 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-logs\") pod \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.967165 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jns4\" (UniqueName: \"kubernetes.io/projected/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-kube-api-access-2jns4\") pod \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\" (UID: \"fdce4eb3-6315-4bdb-af51-aaf680fe57c8\") " Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.968235 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.976452 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.979724 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.988257 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-logs" (OuterVolumeSpecName: "logs") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.992455 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.992625 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-scripts" (OuterVolumeSpecName: "scripts") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:10 crc kubenswrapper[4626]: I0223 06:59:10.998690 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-kube-api-access-2jns4" (OuterVolumeSpecName: "kube-api-access-2jns4") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "kube-api-access-2jns4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.023823 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:11 crc kubenswrapper[4626]: E0223 06:59:11.024229 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-log" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024250 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-log" Feb 23 06:59:11 crc kubenswrapper[4626]: E0223 06:59:11.024265 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-log" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024272 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-log" Feb 23 06:59:11 crc kubenswrapper[4626]: E0223 06:59:11.024282 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-httpd" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024289 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-httpd" Feb 23 06:59:11 crc kubenswrapper[4626]: E0223 06:59:11.024298 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-httpd" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024304 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-httpd" Feb 23 06:59:11 crc kubenswrapper[4626]: E0223 06:59:11.024316 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cb7a2b-38cf-4b21-9d68-a550daf001f0" containerName="init" Feb 23 06:59:11 crc 
kubenswrapper[4626]: I0223 06:59:11.024321 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cb7a2b-38cf-4b21-9d68-a550daf001f0" containerName="init" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024551 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-httpd" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024571 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cb7a2b-38cf-4b21-9d68-a550daf001f0" containerName="init" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024584 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-log" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024600 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" containerName="glance-log" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.024609 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" containerName="glance-httpd" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.025433 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.027860 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.028035 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.054539 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.062424 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075019 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075123 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075170 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-scripts\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075190 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-config-data\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075230 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6wcm\" (UniqueName: \"kubernetes.io/projected/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-kube-api-access-n6wcm\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075347 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075370 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075410 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-logs\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075548 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075565 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075575 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075593 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075602 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-logs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.075611 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jns4\" (UniqueName: \"kubernetes.io/projected/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-kube-api-access-2jns4\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.097724 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.104270 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-config-data" (OuterVolumeSpecName: "config-data") pod "fdce4eb3-6315-4bdb-af51-aaf680fe57c8" (UID: "fdce4eb3-6315-4bdb-af51-aaf680fe57c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.114414 4626 scope.go:117] "RemoveContainer" containerID="0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.120792 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 23 06:59:11 crc kubenswrapper[4626]: E0223 06:59:11.122653 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40\": container with ID starting with 0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40 not found: ID does not exist" containerID="0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.122689 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40"} err="failed to get container status \"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40\": rpc error: code = NotFound desc = could not 
find container \"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40\": container with ID starting with 0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40 not found: ID does not exist" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.122724 4626 scope.go:117] "RemoveContainer" containerID="67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80" Feb 23 06:59:11 crc kubenswrapper[4626]: E0223 06:59:11.123572 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80\": container with ID starting with 67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80 not found: ID does not exist" containerID="67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.123596 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80"} err="failed to get container status \"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80\": rpc error: code = NotFound desc = could not find container \"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80\": container with ID starting with 67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80 not found: ID does not exist" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.123611 4626 scope.go:117] "RemoveContainer" containerID="0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.125147 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40"} err="failed to get container status \"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40\": rpc error: code = NotFound desc = 
could not find container \"0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40\": container with ID starting with 0eacafbaf6eed5549a56f441892f1bb9662b6270a26cce85db64b57ed3641a40 not found: ID does not exist" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.125169 4626 scope.go:117] "RemoveContainer" containerID="67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.130137 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80"} err="failed to get container status \"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80\": rpc error: code = NotFound desc = could not find container \"67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80\": container with ID starting with 67ac774c40949da1d9742660aa11af7e6cfad6e6b703794003731266f2b66f80 not found: ID does not exist" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177547 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177584 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177609 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-logs\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177676 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177744 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177778 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-scripts\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177799 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-config-data\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177834 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6wcm\" (UniqueName: \"kubernetes.io/projected/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-kube-api-access-n6wcm\") pod 
\"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177941 4626 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177959 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.177967 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce4eb3-6315-4bdb-af51-aaf680fe57c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.178132 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.178585 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.179777 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-logs\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") 
" pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.191595 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.193398 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-config-data\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.195009 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.195046 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-scripts\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.197928 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6wcm\" (UniqueName: \"kubernetes.io/projected/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-kube-api-access-n6wcm\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: 
I0223 06:59:11.210183 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.411078 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.876855 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fdce4eb3-6315-4bdb-af51-aaf680fe57c8","Type":"ContainerDied","Data":"3893c533c3b5fd2ea37b63e2a9c14a3a04a2fcb6176a73252781519c7227b097"} Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.877247 4626 scope.go:117] "RemoveContainer" containerID="b2258928ad77e8b9496b32bdcbd111edcde062af2a54a2de901aaed163b96066" Feb 23 06:59:11 crc kubenswrapper[4626]: I0223 06:59:11.876922 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.012804 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f51ed1e0-07d1-462c-91e1-cc35aaead170" path="/var/lib/kubelet/pods/f51ed1e0-07d1-462c-91e1-cc35aaead170/volumes" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.013527 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.022205 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.041692 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.043003 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.053990 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.054318 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.066824 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.079051 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.132360 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.132422 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-config-data\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.132620 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.132790 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.132837 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-scripts\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.132910 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4hxp\" (UniqueName: \"kubernetes.io/projected/794ef965-1710-40d6-93ce-fc78c9799816-kube-api-access-j4hxp\") pod 
\"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.133149 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.133310 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235245 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235324 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235351 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235374 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-config-data\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235470 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235581 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235605 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-scripts\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.235642 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4hxp\" (UniqueName: \"kubernetes.io/projected/794ef965-1710-40d6-93ce-fc78c9799816-kube-api-access-j4hxp\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " 
pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.237624 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.238371 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-logs\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.239145 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.242554 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.250438 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc 
kubenswrapper[4626]: I0223 06:59:12.255421 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-config-data\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.257485 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4hxp\" (UniqueName: \"kubernetes.io/projected/794ef965-1710-40d6-93ce-fc78c9799816-kube-api-access-j4hxp\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.261741 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-scripts\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.277707 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:12 crc kubenswrapper[4626]: I0223 06:59:12.363801 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.244096 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b5d74467f-dcfkd"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.258050 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9c5c7b856-snkxr"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.259869 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.270318 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.270569 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.291225 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9c5c7b856-snkxr"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.387147 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59898cd8f5-xpkhf"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.394135 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-scripts\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.394258 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-combined-ca-bundle\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" 
Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.394342 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt9ql\" (UniqueName: \"kubernetes.io/projected/8624d986-dff6-40bd-937d-755c2ca809d9-kube-api-access-lt9ql\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.394469 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-tls-certs\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.394531 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8624d986-dff6-40bd-937d-755c2ca809d9-logs\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.394674 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-secret-key\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.394803 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-config-data\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 
06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.415210 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.447064 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-688bccf86-4crkw"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.455795 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.466305 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-688bccf86-4crkw"] Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.499922 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-secret-key\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.500065 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-config-data\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.500200 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-scripts\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.500285 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-combined-ca-bundle\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.500355 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt9ql\" (UniqueName: \"kubernetes.io/projected/8624d986-dff6-40bd-937d-755c2ca809d9-kube-api-access-lt9ql\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.500564 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-tls-certs\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.500599 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8624d986-dff6-40bd-937d-755c2ca809d9-logs\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.501096 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8624d986-dff6-40bd-937d-755c2ca809d9-logs\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.504246 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-scripts\") pod \"horizon-9c5c7b856-snkxr\" (UID: 
\"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.504526 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-config-data\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.507662 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-secret-key\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.515077 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-combined-ca-bundle\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.519028 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt9ql\" (UniqueName: \"kubernetes.io/projected/8624d986-dff6-40bd-937d-755c2ca809d9-kube-api-access-lt9ql\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.528778 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-tls-certs\") pod \"horizon-9c5c7b856-snkxr\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc 
kubenswrapper[4626]: I0223 06:59:13.602285 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e1e535-58de-4987-9d93-65fb6d4c9409-scripts\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.602576 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e1e535-58de-4987-9d93-65fb6d4c9409-config-data\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.602671 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-horizon-tls-certs\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.603029 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-horizon-secret-key\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.603065 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lk74\" (UniqueName: \"kubernetes.io/projected/d3e1e535-58de-4987-9d93-65fb6d4c9409-kube-api-access-9lk74\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc 
kubenswrapper[4626]: I0223 06:59:13.603105 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-combined-ca-bundle\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.603303 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e1e535-58de-4987-9d93-65fb6d4c9409-logs\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.633574 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.704770 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e1e535-58de-4987-9d93-65fb6d4c9409-logs\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.704855 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e1e535-58de-4987-9d93-65fb6d4c9409-scripts\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.704883 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e1e535-58de-4987-9d93-65fb6d4c9409-config-data\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " 
pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.704922 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-horizon-tls-certs\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.705007 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-horizon-secret-key\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.705028 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lk74\" (UniqueName: \"kubernetes.io/projected/d3e1e535-58de-4987-9d93-65fb6d4c9409-kube-api-access-9lk74\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.705056 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-combined-ca-bundle\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.707513 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e1e535-58de-4987-9d93-65fb6d4c9409-scripts\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 
06:59:13.707517 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e1e535-58de-4987-9d93-65fb6d4c9409-logs\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.708108 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e1e535-58de-4987-9d93-65fb6d4c9409-config-data\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.720363 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-combined-ca-bundle\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.721399 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-horizon-tls-certs\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.725815 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lk74\" (UniqueName: \"kubernetes.io/projected/d3e1e535-58de-4987-9d93-65fb6d4c9409-kube-api-access-9lk74\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.726207 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/d3e1e535-58de-4987-9d93-65fb6d4c9409-horizon-secret-key\") pod \"horizon-688bccf86-4crkw\" (UID: \"d3e1e535-58de-4987-9d93-65fb6d4c9409\") " pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.782077 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-688bccf86-4crkw" Feb 23 06:59:13 crc kubenswrapper[4626]: I0223 06:59:13.996379 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdce4eb3-6315-4bdb-af51-aaf680fe57c8" path="/var/lib/kubelet/pods/fdce4eb3-6315-4bdb-af51-aaf680fe57c8/volumes" Feb 23 06:59:14 crc kubenswrapper[4626]: I0223 06:59:14.203989 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 06:59:14 crc kubenswrapper[4626]: I0223 06:59:14.276358 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99b585b75-58xkk"] Feb 23 06:59:14 crc kubenswrapper[4626]: I0223 06:59:14.276651 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-99b585b75-58xkk" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" containerID="cri-o://f174a150e2af866afa793084424f2b440d6fce9475cb386254854f53188be4db" gracePeriod=10 Feb 23 06:59:14 crc kubenswrapper[4626]: I0223 06:59:14.951058 4626 generic.go:334] "Generic (PLEG): container finished" podID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerID="f174a150e2af866afa793084424f2b440d6fce9475cb386254854f53188be4db" exitCode=0 Feb 23 06:59:14 crc kubenswrapper[4626]: I0223 06:59:14.951170 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99b585b75-58xkk" event={"ID":"1c870a4f-2c97-4836-9522-fd73c0a9d3ef","Type":"ContainerDied","Data":"f174a150e2af866afa793084424f2b440d6fce9475cb386254854f53188be4db"} Feb 23 06:59:16 crc kubenswrapper[4626]: I0223 06:59:16.977536 4626 generic.go:334] "Generic 
(PLEG): container finished" podID="81f3630a-a5f4-4a54-91e3-e6764673beca" containerID="165b5f3d9af6d08ff22d0934fda72534b4c88204b2d76378f1734acceafb7f84" exitCode=0 Feb 23 06:59:16 crc kubenswrapper[4626]: I0223 06:59:16.977686 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jp5n9" event={"ID":"81f3630a-a5f4-4a54-91e3-e6764673beca","Type":"ContainerDied","Data":"165b5f3d9af6d08ff22d0934fda72534b4c88204b2d76378f1734acceafb7f84"} Feb 23 06:59:19 crc kubenswrapper[4626]: I0223 06:59:19.174014 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-99b585b75-58xkk" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: connect: connection refused" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.551524 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.718829 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrbq5\" (UniqueName: \"kubernetes.io/projected/58a8c613-a4a0-4760-ad05-1cd267388fc2-kube-api-access-xrbq5\") pod \"58a8c613-a4a0-4760-ad05-1cd267388fc2\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.719064 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-credential-keys\") pod \"58a8c613-a4a0-4760-ad05-1cd267388fc2\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.719145 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-combined-ca-bundle\") pod 
\"58a8c613-a4a0-4760-ad05-1cd267388fc2\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.719300 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-scripts\") pod \"58a8c613-a4a0-4760-ad05-1cd267388fc2\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.719336 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-config-data\") pod \"58a8c613-a4a0-4760-ad05-1cd267388fc2\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.719373 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-fernet-keys\") pod \"58a8c613-a4a0-4760-ad05-1cd267388fc2\" (UID: \"58a8c613-a4a0-4760-ad05-1cd267388fc2\") " Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.739572 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "58a8c613-a4a0-4760-ad05-1cd267388fc2" (UID: "58a8c613-a4a0-4760-ad05-1cd267388fc2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.740955 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-scripts" (OuterVolumeSpecName: "scripts") pod "58a8c613-a4a0-4760-ad05-1cd267388fc2" (UID: "58a8c613-a4a0-4760-ad05-1cd267388fc2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.741244 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "58a8c613-a4a0-4760-ad05-1cd267388fc2" (UID: "58a8c613-a4a0-4760-ad05-1cd267388fc2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.741444 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a8c613-a4a0-4760-ad05-1cd267388fc2-kube-api-access-xrbq5" (OuterVolumeSpecName: "kube-api-access-xrbq5") pod "58a8c613-a4a0-4760-ad05-1cd267388fc2" (UID: "58a8c613-a4a0-4760-ad05-1cd267388fc2"). InnerVolumeSpecName "kube-api-access-xrbq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.745995 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58a8c613-a4a0-4760-ad05-1cd267388fc2" (UID: "58a8c613-a4a0-4760-ad05-1cd267388fc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.769858 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-config-data" (OuterVolumeSpecName: "config-data") pod "58a8c613-a4a0-4760-ad05-1cd267388fc2" (UID: "58a8c613-a4a0-4760-ad05-1cd267388fc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.822266 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.822296 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.822305 4626 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.822317 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrbq5\" (UniqueName: \"kubernetes.io/projected/58a8c613-a4a0-4760-ad05-1cd267388fc2-kube-api-access-xrbq5\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.822330 4626 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:22 crc kubenswrapper[4626]: I0223 06:59:22.822342 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a8c613-a4a0-4760-ad05-1cd267388fc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.037210 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"38f3c8f5-dca2-4b37-81cb-6d25edbf1035","Type":"ContainerStarted","Data":"6f1c450013d3b52c28e04b567b136a2c31a1add052afabd57c9cf511f32bea3b"} Feb 23 06:59:23 crc 
kubenswrapper[4626]: I0223 06:59:23.039410 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-njplj" event={"ID":"58a8c613-a4a0-4760-ad05-1cd267388fc2","Type":"ContainerDied","Data":"61ea565b4dda760c01394793039a12f92b55d5e640dc2206022a27453bfb6ba3"} Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.039452 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61ea565b4dda760c01394793039a12f92b55d5e640dc2206022a27453bfb6ba3" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.039465 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-njplj" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.647867 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-njplj"] Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.657195 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-njplj"] Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.738705 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6x8cq"] Feb 23 06:59:23 crc kubenswrapper[4626]: E0223 06:59:23.739361 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a8c613-a4a0-4760-ad05-1cd267388fc2" containerName="keystone-bootstrap" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.739381 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a8c613-a4a0-4760-ad05-1cd267388fc2" containerName="keystone-bootstrap" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.739743 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a8c613-a4a0-4760-ad05-1cd267388fc2" containerName="keystone-bootstrap" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.740541 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.744157 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-credential-keys\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.744246 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjb5n\" (UniqueName: \"kubernetes.io/projected/99407a1e-403e-4460-8bff-8eb644010b4c-kube-api-access-bjb5n\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.744303 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-combined-ca-bundle\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.744345 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-scripts\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.744403 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-config-data\") pod \"keystone-bootstrap-6x8cq\" (UID: 
\"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.744492 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-fernet-keys\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.746796 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6x8cq"] Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.747543 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.747625 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.748347 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.748582 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.748815 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-25kl2" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.846195 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-fernet-keys\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.846356 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-credential-keys\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.846400 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjb5n\" (UniqueName: \"kubernetes.io/projected/99407a1e-403e-4460-8bff-8eb644010b4c-kube-api-access-bjb5n\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.846428 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-combined-ca-bundle\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.846459 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-scripts\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.846493 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-config-data\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.854373 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-credential-keys\") pod 
\"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.854406 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-scripts\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.854480 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-fernet-keys\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.854664 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-config-data\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.862221 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-combined-ca-bundle\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc kubenswrapper[4626]: I0223 06:59:23.867935 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjb5n\" (UniqueName: \"kubernetes.io/projected/99407a1e-403e-4460-8bff-8eb644010b4c-kube-api-access-bjb5n\") pod \"keystone-bootstrap-6x8cq\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:23 crc 
kubenswrapper[4626]: I0223 06:59:23.994371 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a8c613-a4a0-4760-ad05-1cd267388fc2" path="/var/lib/kubelet/pods/58a8c613-a4a0-4760-ad05-1cd267388fc2/volumes" Feb 23 06:59:24 crc kubenswrapper[4626]: I0223 06:59:24.069238 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:27 crc kubenswrapper[4626]: E0223 06:59:27.289304 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:27 crc kubenswrapper[4626]: E0223 06:59:27.289716 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:27 crc kubenswrapper[4626]: E0223 06:59:27.289902 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n548h598h66chf7h7fh5bh9fh657h5c7h59ch569h59bh596h78h54ch57fh5d7hb8h587h86hc6hfchc6hc5h69h5ch54fh654h549h59h5ddh76q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kmpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-67fd4f7c55-mmtls_openstack(bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:59:27 crc kubenswrapper[4626]: E0223 
06:59:27.292043 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb\\\"\"]" pod="openstack/horizon-67fd4f7c55-mmtls" podUID="bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f" Feb 23 06:59:28 crc kubenswrapper[4626]: E0223 06:59:28.814976 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:28 crc kubenswrapper[4626]: E0223 06:59:28.815306 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:28 crc kubenswrapper[4626]: E0223 06:59:28.815458 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n554h5h588h56h676h688h5fch657h95hdbhdch66ch5b9h5cdh566h689h54ch58bh5f5h95h8fh688h5cdh6fh8bh69h58dh58fh5c7h59h65fhc4q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-544ls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7b5d74467f-dcfkd_openstack(b40e0f9f-9c08-4344-ad79-47d1663967b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:59:28 crc kubenswrapper[4626]: E0223 
06:59:28.819239 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8419493e1fd846703d277695e03fc5eb\\\"\"]" pod="openstack/horizon-7b5d74467f-dcfkd" podUID="b40e0f9f-9c08-4344-ad79-47d1663967b9" Feb 23 06:59:28 crc kubenswrapper[4626]: I0223 06:59:28.875975 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jp5n9" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.065559 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mtw5\" (UniqueName: \"kubernetes.io/projected/81f3630a-a5f4-4a54-91e3-e6764673beca-kube-api-access-5mtw5\") pod \"81f3630a-a5f4-4a54-91e3-e6764673beca\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.065675 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-combined-ca-bundle\") pod \"81f3630a-a5f4-4a54-91e3-e6764673beca\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.065720 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-config\") pod \"81f3630a-a5f4-4a54-91e3-e6764673beca\" (UID: \"81f3630a-a5f4-4a54-91e3-e6764673beca\") " Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.085198 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f3630a-a5f4-4a54-91e3-e6764673beca-kube-api-access-5mtw5" (OuterVolumeSpecName: 
"kube-api-access-5mtw5") pod "81f3630a-a5f4-4a54-91e3-e6764673beca" (UID: "81f3630a-a5f4-4a54-91e3-e6764673beca"). InnerVolumeSpecName "kube-api-access-5mtw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.094536 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81f3630a-a5f4-4a54-91e3-e6764673beca" (UID: "81f3630a-a5f4-4a54-91e3-e6764673beca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.094986 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-config" (OuterVolumeSpecName: "config") pod "81f3630a-a5f4-4a54-91e3-e6764673beca" (UID: "81f3630a-a5f4-4a54-91e3-e6764673beca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.104063 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jp5n9" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.108965 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jp5n9" event={"ID":"81f3630a-a5f4-4a54-91e3-e6764673beca","Type":"ContainerDied","Data":"04e3561c207cfcec0429ead72ce136d88087beee3325f85e3a350e2f831d0147"} Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.109058 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04e3561c207cfcec0429ead72ce136d88087beee3325f85e3a350e2f831d0147" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.169716 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.171445 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/81f3630a-a5f4-4a54-91e3-e6764673beca-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.171480 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mtw5\" (UniqueName: \"kubernetes.io/projected/81f3630a-a5f4-4a54-91e3-e6764673beca-kube-api-access-5mtw5\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:29 crc kubenswrapper[4626]: I0223 06:59:29.175286 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-99b585b75-58xkk" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.133980 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd9b88b4f-724lk"] Feb 23 06:59:30 crc kubenswrapper[4626]: E0223 06:59:30.134677 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="81f3630a-a5f4-4a54-91e3-e6764673beca" containerName="neutron-db-sync" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.134695 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f3630a-a5f4-4a54-91e3-e6764673beca" containerName="neutron-db-sync" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.134906 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f3630a-a5f4-4a54-91e3-e6764673beca" containerName="neutron-db-sync" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.135848 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.150043 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd9b88b4f-724lk"] Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.301035 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-svc\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.301108 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.301158 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " 
pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.301189 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8bq5\" (UniqueName: \"kubernetes.io/projected/8dcb0164-5b5e-456b-9918-f42483a73601-kube-api-access-l8bq5\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.301265 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-config\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.301395 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.374432 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-668d74f4c6-dk9gw"] Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.377018 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.381156 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.381311 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.381631 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.381779 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-828v2" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.402417 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-668d74f4c6-dk9gw"] Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.403381 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.403461 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-svc\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.403521 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: 
\"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.403546 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.403570 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8bq5\" (UniqueName: \"kubernetes.io/projected/8dcb0164-5b5e-456b-9918-f42483a73601-kube-api-access-l8bq5\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.403620 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-config\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.404675 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-config\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.405188 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" 
Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.405445 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.405837 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.406529 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-svc\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.429705 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8bq5\" (UniqueName: \"kubernetes.io/projected/8dcb0164-5b5e-456b-9918-f42483a73601-kube-api-access-l8bq5\") pod \"dnsmasq-dns-7bd9b88b4f-724lk\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.459198 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.507397 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-config\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.507590 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-httpd-config\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.507705 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82v58\" (UniqueName: \"kubernetes.io/projected/d0643197-bfab-42f2-bfef-85ab66daf967-kube-api-access-82v58\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.507784 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-combined-ca-bundle\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.507959 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-ovndb-tls-certs\") pod \"neutron-668d74f4c6-dk9gw\" (UID: 
\"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.610078 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-config\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.610124 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-httpd-config\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.610221 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82v58\" (UniqueName: \"kubernetes.io/projected/d0643197-bfab-42f2-bfef-85ab66daf967-kube-api-access-82v58\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.610278 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-combined-ca-bundle\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.610325 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-ovndb-tls-certs\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 
crc kubenswrapper[4626]: I0223 06:59:30.616167 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-ovndb-tls-certs\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.617859 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-combined-ca-bundle\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.625817 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-config\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.685288 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82v58\" (UniqueName: \"kubernetes.io/projected/d0643197-bfab-42f2-bfef-85ab66daf967-kube-api-access-82v58\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.698471 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-httpd-config\") pod \"neutron-668d74f4c6-dk9gw\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") " pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:30 crc kubenswrapper[4626]: I0223 06:59:30.992654 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.410745 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-848f958bdf-dbqk8"] Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.420259 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.423491 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.423662 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.431684 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-848f958bdf-dbqk8"] Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.574471 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl4bs\" (UniqueName: \"kubernetes.io/projected/68c3a490-db52-4fed-baf1-07c3cf9b06bc-kube-api-access-dl4bs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.574620 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-httpd-config\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.574682 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-combined-ca-bundle\") pod 
\"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.574760 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-public-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.574791 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-config\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.574811 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-ovndb-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.574849 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-internal-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.678075 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl4bs\" (UniqueName: \"kubernetes.io/projected/68c3a490-db52-4fed-baf1-07c3cf9b06bc-kube-api-access-dl4bs\") pod 
\"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.678299 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-httpd-config\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.678400 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-combined-ca-bundle\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.678492 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-public-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.678544 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-config\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.678568 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-ovndb-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " 
pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.678622 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-internal-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.684849 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-config\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.686886 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-ovndb-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.694274 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-httpd-config\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.697103 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-internal-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.704163 4626 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-combined-ca-bundle\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.715124 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl4bs\" (UniqueName: \"kubernetes.io/projected/68c3a490-db52-4fed-baf1-07c3cf9b06bc-kube-api-access-dl4bs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.715309 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-public-tls-certs\") pod \"neutron-848f958bdf-dbqk8\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:32 crc kubenswrapper[4626]: I0223 06:59:32.740330 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 06:59:34 crc kubenswrapper[4626]: I0223 06:59:34.182446 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-99b585b75-58xkk" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Feb 23 06:59:34 crc kubenswrapper[4626]: I0223 06:59:34.183710 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-99b585b75-58xkk" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.327193 4626 scope.go:117] "RemoveContainer" containerID="9e8367043f12a1edc5158535150a9b16737af30449ce06d2a8a72aeea3021423" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.444586 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99b585b75-58xkk" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.591801 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-dns-svc\") pod \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.591982 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-config\") pod \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.592492 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-sb\") pod \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 
06:59:37.592546 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-nb\") pod \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.592659 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clrtd\" (UniqueName: \"kubernetes.io/projected/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-kube-api-access-clrtd\") pod \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\" (UID: \"1c870a4f-2c97-4836-9522-fd73c0a9d3ef\") " Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.615100 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-kube-api-access-clrtd" (OuterVolumeSpecName: "kube-api-access-clrtd") pod "1c870a4f-2c97-4836-9522-fd73c0a9d3ef" (UID: "1c870a4f-2c97-4836-9522-fd73c0a9d3ef"). InnerVolumeSpecName "kube-api-access-clrtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.630893 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c870a4f-2c97-4836-9522-fd73c0a9d3ef" (UID: "1c870a4f-2c97-4836-9522-fd73c0a9d3ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.631612 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-config" (OuterVolumeSpecName: "config") pod "1c870a4f-2c97-4836-9522-fd73c0a9d3ef" (UID: "1c870a4f-2c97-4836-9522-fd73c0a9d3ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.633607 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c870a4f-2c97-4836-9522-fd73c0a9d3ef" (UID: "1c870a4f-2c97-4836-9522-fd73c0a9d3ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.642692 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c870a4f-2c97-4836-9522-fd73c0a9d3ef" (UID: "1c870a4f-2c97-4836-9522-fd73c0a9d3ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.695240 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clrtd\" (UniqueName: \"kubernetes.io/projected/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-kube-api-access-clrtd\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.695269 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.695279 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.695288 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 
23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.695296 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c870a4f-2c97-4836-9522-fd73c0a9d3ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:37 crc kubenswrapper[4626]: E0223 06:59:37.793056 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:37 crc kubenswrapper[4626]: E0223 06:59:37.793153 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:37 crc kubenswrapper[4626]: E0223 06:59:37.793964 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnqdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-fttdm_openstack(c1806f1a-08dd-4b17-a799-1122348a4ab3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 
23 06:59:37 crc kubenswrapper[4626]: E0223 06:59:37.795199 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-fttdm" podUID="c1806f1a-08dd-4b17-a799-1122348a4ab3" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.801622 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67fd4f7c55-mmtls" Feb 23 06:59:37 crc kubenswrapper[4626]: I0223 06:59:37.858022 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.004158 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-logs\") pod \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.004559 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-logs" (OuterVolumeSpecName: "logs") pod "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f" (UID: "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.004741 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-horizon-secret-key\") pod \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.006352 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-config-data\") pod \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.006427 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kmpv\" (UniqueName: \"kubernetes.io/projected/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-kube-api-access-8kmpv\") pod \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.006516 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-scripts\") pod \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\" (UID: \"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.008972 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-scripts" (OuterVolumeSpecName: "scripts") pod "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f" (UID: "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.009013 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-config-data" (OuterVolumeSpecName: "config-data") pod "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f" (UID: "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.011794 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.011985 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.012004 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-logs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.015150 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-kube-api-access-8kmpv" (OuterVolumeSpecName: "kube-api-access-8kmpv") pod "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f" (UID: "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f"). InnerVolumeSpecName "kube-api-access-8kmpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.015713 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f" (UID: "bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.114372 4626 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.114411 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kmpv\" (UniqueName: \"kubernetes.io/projected/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f-kube-api-access-8kmpv\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:38 crc kubenswrapper[4626]: E0223 06:59:38.142460 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:38 crc kubenswrapper[4626]: E0223 06:59:38.142545 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:8419493e1fd846703d277695e03fc5eb" Feb 23 06:59:38 crc kubenswrapper[4626]: E0223 06:59:38.142743 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-central:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58ch57dh688h559hf6h696h555h57dh69h5d6h548h76hc8h597hdbh66hcdhb7h549hf5h558h5f6h85h696h64fh6bh69h96h5d8h6ch686h69q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvp9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.172411 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b5d74467f-dcfkd" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.209733 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99b585b75-58xkk" event={"ID":"1c870a4f-2c97-4836-9522-fd73c0a9d3ef","Type":"ContainerDied","Data":"025e803dc16bcd91ec64717b4521a5c89b0ca31980cba7fa3f0769b39723a209"} Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.209887 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-99b585b75-58xkk" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.211439 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd4f7c55-mmtls" event={"ID":"bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f","Type":"ContainerDied","Data":"86bb47b80555c5fe7492c3c794b4f979ebe539708cc9035943e911ae964211df"} Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.211552 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67fd4f7c55-mmtls" Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.215986 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-544ls\" (UniqueName: \"kubernetes.io/projected/b40e0f9f-9c08-4344-ad79-47d1663967b9-kube-api-access-544ls\") pod \"b40e0f9f-9c08-4344-ad79-47d1663967b9\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.216034 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40e0f9f-9c08-4344-ad79-47d1663967b9-logs\") pod \"b40e0f9f-9c08-4344-ad79-47d1663967b9\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.216094 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-scripts\") pod \"b40e0f9f-9c08-4344-ad79-47d1663967b9\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.216196 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-config-data\") pod \"b40e0f9f-9c08-4344-ad79-47d1663967b9\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") " Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 
06:59:38.216384 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b40e0f9f-9c08-4344-ad79-47d1663967b9-horizon-secret-key\") pod \"b40e0f9f-9c08-4344-ad79-47d1663967b9\" (UID: \"b40e0f9f-9c08-4344-ad79-47d1663967b9\") "
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.216455 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b40e0f9f-9c08-4344-ad79-47d1663967b9-logs" (OuterVolumeSpecName: "logs") pod "b40e0f9f-9c08-4344-ad79-47d1663967b9" (UID: "b40e0f9f-9c08-4344-ad79-47d1663967b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.216785 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-scripts" (OuterVolumeSpecName: "scripts") pod "b40e0f9f-9c08-4344-ad79-47d1663967b9" (UID: "b40e0f9f-9c08-4344-ad79-47d1663967b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.216969 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-config-data" (OuterVolumeSpecName: "config-data") pod "b40e0f9f-9c08-4344-ad79-47d1663967b9" (UID: "b40e0f9f-9c08-4344-ad79-47d1663967b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.217874 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5d74467f-dcfkd" event={"ID":"b40e0f9f-9c08-4344-ad79-47d1663967b9","Type":"ContainerDied","Data":"4119ebbde0e31b880dece4174dc50a534a73f7eb80d56fa969861733bcefd739"}
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.217894 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b5d74467f-dcfkd"
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.218570 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b40e0f9f-9c08-4344-ad79-47d1663967b9-logs\") on node \"crc\" DevicePath \"\""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.218845 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.218872 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b40e0f9f-9c08-4344-ad79-47d1663967b9-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 06:59:38 crc kubenswrapper[4626]: E0223 06:59:38.220812 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/heat-db-sync-fttdm" podUID="c1806f1a-08dd-4b17-a799-1122348a4ab3"
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.230826 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40e0f9f-9c08-4344-ad79-47d1663967b9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b40e0f9f-9c08-4344-ad79-47d1663967b9" (UID: "b40e0f9f-9c08-4344-ad79-47d1663967b9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.231353 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40e0f9f-9c08-4344-ad79-47d1663967b9-kube-api-access-544ls" (OuterVolumeSpecName: "kube-api-access-544ls") pod "b40e0f9f-9c08-4344-ad79-47d1663967b9" (UID: "b40e0f9f-9c08-4344-ad79-47d1663967b9"). InnerVolumeSpecName "kube-api-access-544ls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.236243 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99b585b75-58xkk"]
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.240724 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-99b585b75-58xkk"]
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.285313 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67fd4f7c55-mmtls"]
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.296193 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67fd4f7c55-mmtls"]
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.321222 4626 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b40e0f9f-9c08-4344-ad79-47d1663967b9-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.321369 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-544ls\" (UniqueName: \"kubernetes.io/projected/b40e0f9f-9c08-4344-ad79-47d1663967b9-kube-api-access-544ls\") on node \"crc\" DevicePath \"\""
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.601752 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b5d74467f-dcfkd"]
Feb 23 06:59:38 crc kubenswrapper[4626]: I0223 06:59:38.616059 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b5d74467f-dcfkd"]
Feb 23 06:59:39 crc kubenswrapper[4626]: I0223 06:59:39.183323 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-99b585b75-58xkk" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout"
Feb 23 06:59:39 crc kubenswrapper[4626]: E0223 06:59:39.346488 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb"
Feb 23 06:59:39 crc kubenswrapper[4626]: E0223 06:59:39.346621 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb"
Feb 23 06:59:39 crc kubenswrapper[4626]: E0223 06:59:39.346897 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k4nmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-47npm_openstack(c5cc94ca-558e-4a2c-8d28-5aedbecb3090): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 23 06:59:39 crc kubenswrapper[4626]: E0223 06:59:39.348131 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-47npm" podUID="c5cc94ca-558e-4a2c-8d28-5aedbecb3090"
Feb 23 06:59:39 crc kubenswrapper[4626]: W0223 06:59:39.374945 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod794ef965_1710_40d6_93ce_fc78c9799816.slice/crio-49964ef277a7377912e18efe1817329028c81aa7cf0121c360882f8453be18ee WatchSource:0}: Error finding container 49964ef277a7377912e18efe1817329028c81aa7cf0121c360882f8453be18ee: Status 404 returned error can't find the container with id 49964ef277a7377912e18efe1817329028c81aa7cf0121c360882f8453be18ee
Feb 23 06:59:39 crc kubenswrapper[4626]: I0223 06:59:39.414738 4626 scope.go:117] "RemoveContainer" containerID="f174a150e2af866afa793084424f2b440d6fce9475cb386254854f53188be4db"
Feb 23 06:59:39 crc kubenswrapper[4626]: I0223 06:59:39.604989 4626 scope.go:117] "RemoveContainer" containerID="2634bbd63023c0de3a1f692a02fc08f73ed899056ccd6f23e85d165bbfa74175"
Feb 23 06:59:39 crc kubenswrapper[4626]: I0223 06:59:39.842062 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6x8cq"]
Feb 23 06:59:39 crc kubenswrapper[4626]: I0223 06:59:39.975087 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-688bccf86-4crkw"]
Feb 23 06:59:40 crc kubenswrapper[4626]: W0223 06:59:40.004381 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3e1e535_58de_4987_9d93_65fb6d4c9409.slice/crio-788393d74daf1deccddb6f3f3812ea6ef9df04a07449879926fb8eed37a9dc1b WatchSource:0}: Error finding container 788393d74daf1deccddb6f3f3812ea6ef9df04a07449879926fb8eed37a9dc1b: Status 404 returned error can't find the container with id 788393d74daf1deccddb6f3f3812ea6ef9df04a07449879926fb8eed37a9dc1b
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.031741 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" path="/var/lib/kubelet/pods/1c870a4f-2c97-4836-9522-fd73c0a9d3ef/volumes"
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.032577 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40e0f9f-9c08-4344-ad79-47d1663967b9" path="/var/lib/kubelet/pods/b40e0f9f-9c08-4344-ad79-47d1663967b9/volumes"
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.032984 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f" path="/var/lib/kubelet/pods/bd2e1cea-a35e-4476-bd6c-54ce79fc2e5f/volumes"
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.071667 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9c5c7b856-snkxr"]
Feb 23 06:59:40 crc kubenswrapper[4626]: W0223 06:59:40.082332 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8624d986_dff6_40bd_937d_755c2ca809d9.slice/crio-d1466fa67cdc3b13ffb527d31f4d6111aa7b04af96e278d8aaae713591e1e044 WatchSource:0}: Error finding container d1466fa67cdc3b13ffb527d31f4d6111aa7b04af96e278d8aaae713591e1e044: Status 404 returned error can't find the container with id d1466fa67cdc3b13ffb527d31f4d6111aa7b04af96e278d8aaae713591e1e044
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.184790 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd9b88b4f-724lk"]
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.245742 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" event={"ID":"8dcb0164-5b5e-456b-9918-f42483a73601","Type":"ContainerStarted","Data":"cbf0d9dee681d5e2aed577d3aae8d66585ce818c880d6712aedc117a58d8b0f4"}
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.248645 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x8cq" event={"ID":"99407a1e-403e-4460-8bff-8eb644010b4c","Type":"ContainerStarted","Data":"c3eb5e5ed5403ffe30c6132a2206efd37536c5594917200c7d05cc3d92f38279"}
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.255119 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59898cd8f5-xpkhf" event={"ID":"7b2634f7-a041-40b7-beb0-36e366627314","Type":"ContainerStarted","Data":"fdbb1fd152f3bb632f553e9ac09fd4e0fea49c8112c24a80099edf080a6eeadb"}
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.256371 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c5c7b856-snkxr" event={"ID":"8624d986-dff6-40bd-937d-755c2ca809d9","Type":"ContainerStarted","Data":"d1466fa67cdc3b13ffb527d31f4d6111aa7b04af96e278d8aaae713591e1e044"}
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.257733 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-848f958bdf-dbqk8"]
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.258719 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"794ef965-1710-40d6-93ce-fc78c9799816","Type":"ContainerStarted","Data":"49964ef277a7377912e18efe1817329028c81aa7cf0121c360882f8453be18ee"}
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.263746 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-688bccf86-4crkw" event={"ID":"d3e1e535-58de-4987-9d93-65fb6d4c9409","Type":"ContainerStarted","Data":"788393d74daf1deccddb6f3f3812ea6ef9df04a07449879926fb8eed37a9dc1b"}
Feb 23 06:59:40 crc kubenswrapper[4626]: E0223 06:59:40.267489 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/cinder-db-sync-47npm" podUID="c5cc94ca-558e-4a2c-8d28-5aedbecb3090"
Feb 23 06:59:40 crc kubenswrapper[4626]: I0223 06:59:40.877741 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-668d74f4c6-dk9gw"]
Feb 23 06:59:40 crc kubenswrapper[4626]: W0223 06:59:40.910106 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0643197_bfab_42f2_bfef_85ab66daf967.slice/crio-fd27815428914fc07ee97e6c6fae7f44495c86ef62b2464a2733527953eee774 WatchSource:0}: Error finding container fd27815428914fc07ee97e6c6fae7f44495c86ef62b2464a2733527953eee774: Status 404 returned error can't find the container with id fd27815428914fc07ee97e6c6fae7f44495c86ef62b2464a2733527953eee774
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.318885 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7stdk" event={"ID":"a203dc9f-43a6-4cf4-ac68-7c5125053cba","Type":"ContainerStarted","Data":"71904e160c42cf04472b0c2a1700ca0d85acc9141bf20632c450a5294151df1b"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.326879 4626 generic.go:334] "Generic (PLEG): container finished" podID="8dcb0164-5b5e-456b-9918-f42483a73601" containerID="ad17177ef36fac13085c50e9ad24262a26c30c8a6c34e7e88ef3c219c3e64999" exitCode=0
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.327105 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" event={"ID":"8dcb0164-5b5e-456b-9918-f42483a73601","Type":"ContainerDied","Data":"ad17177ef36fac13085c50e9ad24262a26c30c8a6c34e7e88ef3c219c3e64999"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.342963 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x8cq" event={"ID":"99407a1e-403e-4460-8bff-8eb644010b4c","Type":"ContainerStarted","Data":"62d1aeac828dbe31c2b117686504cfca86362d4c695dde2b94f097502f9b946f"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.344980 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-848f958bdf-dbqk8" event={"ID":"68c3a490-db52-4fed-baf1-07c3cf9b06bc","Type":"ContainerStarted","Data":"5910599f46b3fdc6c40589f88c6a1b675f88837cad419ce983039d97c9e5595b"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.345011 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-848f958bdf-dbqk8" event={"ID":"68c3a490-db52-4fed-baf1-07c3cf9b06bc","Type":"ContainerStarted","Data":"3090702eb32cae40cade39268a090cfd12ecbacef56880571515fee04f33a1bc"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.350487 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7stdk" podStartSLOduration=4.321959088 podStartE2EDuration="38.350455886s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="2026-02-23 06:59:05.343231743 +0000 UTC m=+1097.682561009" lastFinishedPulling="2026-02-23 06:59:39.371728542 +0000 UTC m=+1131.711057807" observedRunningTime="2026-02-23 06:59:41.334596852 +0000 UTC m=+1133.673926108" watchObservedRunningTime="2026-02-23 06:59:41.350455886 +0000 UTC m=+1133.689785152"
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.370269 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59898cd8f5-xpkhf" event={"ID":"7b2634f7-a041-40b7-beb0-36e366627314","Type":"ContainerStarted","Data":"8a52ca9edb3dc9d8c1c17706c7dbbf5e4738155850985c8a92d027b4fb6091e8"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.370416 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-59898cd8f5-xpkhf" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon-log" containerID="cri-o://fdbb1fd152f3bb632f553e9ac09fd4e0fea49c8112c24a80099edf080a6eeadb" gracePeriod=30
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.370671 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-59898cd8f5-xpkhf" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon" containerID="cri-o://8a52ca9edb3dc9d8c1c17706c7dbbf5e4738155850985c8a92d027b4fb6091e8" gracePeriod=30
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.375695 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"794ef965-1710-40d6-93ce-fc78c9799816","Type":"ContainerStarted","Data":"dc78b29f4c4442aefbf175ce70ed96229c271f067a54cc0e7fd67fa804cb1bf5"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.382576 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"38f3c8f5-dca2-4b37-81cb-6d25edbf1035","Type":"ContainerStarted","Data":"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.387724 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668d74f4c6-dk9gw" event={"ID":"d0643197-bfab-42f2-bfef-85ab66daf967","Type":"ContainerStarted","Data":"fd27815428914fc07ee97e6c6fae7f44495c86ef62b2464a2733527953eee774"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.392381 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nxxgl" event={"ID":"498f5c1c-5f75-49e1-909a-e7ce904ebd9d","Type":"ContainerStarted","Data":"887e4fb3093570f30aaf87674069d3a964dda7b9537cf7d6ece01231c19c72aa"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.469180 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-nxxgl" podStartSLOduration=4.403374767 podStartE2EDuration="38.469166044s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="2026-02-23 06:59:05.3059458 +0000 UTC m=+1097.645275066" lastFinishedPulling="2026-02-23 06:59:39.371737078 +0000 UTC m=+1131.711066343" observedRunningTime="2026-02-23 06:59:41.463129565 +0000 UTC m=+1133.802458831" watchObservedRunningTime="2026-02-23 06:59:41.469166044 +0000 UTC m=+1133.808495311"
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.475758 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6x8cq" podStartSLOduration=18.475734756 podStartE2EDuration="18.475734756s" podCreationTimestamp="2026-02-23 06:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:41.429741949 +0000 UTC m=+1133.769071216" watchObservedRunningTime="2026-02-23 06:59:41.475734756 +0000 UTC m=+1133.815064022"
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.508682 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-59898cd8f5-xpkhf" podStartSLOduration=5.179723693 podStartE2EDuration="35.508670569s" podCreationTimestamp="2026-02-23 06:59:06 +0000 UTC" firstStartedPulling="2026-02-23 06:59:07.805880316 +0000 UTC m=+1100.145209582" lastFinishedPulling="2026-02-23 06:59:38.134827202 +0000 UTC m=+1130.474156458" observedRunningTime="2026-02-23 06:59:41.497342205 +0000 UTC m=+1133.836671462" watchObservedRunningTime="2026-02-23 06:59:41.508670569 +0000 UTC m=+1133.847999836"
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.532123 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c5c7b856-snkxr" event={"ID":"8624d986-dff6-40bd-937d-755c2ca809d9","Type":"ContainerStarted","Data":"60f2992022985555c9b19a0c7fb549a7d54427452a1fd539453a54ef31a10a6b"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.532391 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c5c7b856-snkxr" event={"ID":"8624d986-dff6-40bd-937d-755c2ca809d9","Type":"ContainerStarted","Data":"a5a014efa76b7de50de6290e6814485f21418749702dd366d652bb5a811d9fcf"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.585930 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-688bccf86-4crkw" event={"ID":"d3e1e535-58de-4987-9d93-65fb6d4c9409","Type":"ContainerStarted","Data":"2c5749f4aadf1259b28bcb92c0f7886d9c2489347b5612f43e45564e75ed8820"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.585980 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-688bccf86-4crkw" event={"ID":"d3e1e535-58de-4987-9d93-65fb6d4c9409","Type":"ContainerStarted","Data":"f5b2605da52049890556f18c6c36f20be83c9a1c80c2a9df82660b7592b24f7c"}
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.632010 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9c5c7b856-snkxr" podStartSLOduration=28.631983333 podStartE2EDuration="28.631983333s" podCreationTimestamp="2026-02-23 06:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:41.559368876 +0000 UTC m=+1133.898698132" watchObservedRunningTime="2026-02-23 06:59:41.631983333 +0000 UTC m=+1133.971312599"
Feb 23 06:59:41 crc kubenswrapper[4626]: I0223 06:59:41.638196 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-688bccf86-4crkw" podStartSLOduration=28.638184772 podStartE2EDuration="28.638184772s" podCreationTimestamp="2026-02-23 06:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:41.623082265 +0000 UTC m=+1133.962411531" watchObservedRunningTime="2026-02-23 06:59:41.638184772 +0000 UTC m=+1133.977514039"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.596204 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"794ef965-1710-40d6-93ce-fc78c9799816","Type":"ContainerStarted","Data":"7719649822c37240944f8185d05b9d75b2c83ddf2ba1294fab32b832475a81f7"}
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.596311 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-log" containerID="cri-o://dc78b29f4c4442aefbf175ce70ed96229c271f067a54cc0e7fd67fa804cb1bf5" gracePeriod=30
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.596601 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-httpd" containerID="cri-o://7719649822c37240944f8185d05b9d75b2c83ddf2ba1294fab32b832475a81f7" gracePeriod=30
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.604887 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"38f3c8f5-dca2-4b37-81cb-6d25edbf1035","Type":"ContainerStarted","Data":"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe"}
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.605036 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-log" containerID="cri-o://da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe" gracePeriod=30
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.605140 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-httpd" containerID="cri-o://d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe" gracePeriod=30
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.613356 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" event={"ID":"8dcb0164-5b5e-456b-9918-f42483a73601","Type":"ContainerStarted","Data":"949ba8dbc8bc52479eeeff167ea47af212c50fdd1f26639d7198135fb37578a8"}
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.614642 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.641380 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-848f958bdf-dbqk8" event={"ID":"68c3a490-db52-4fed-baf1-07c3cf9b06bc","Type":"ContainerStarted","Data":"01305754c22a518c1d956f73ab61f2f9e3770c5759554be086ff146263c604af"}
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.642251 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-848f958bdf-dbqk8"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.643158 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.64314783 podStartE2EDuration="31.64314783s" podCreationTimestamp="2026-02-23 06:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:42.622091599 +0000 UTC m=+1134.961420855" watchObservedRunningTime="2026-02-23 06:59:42.64314783 +0000 UTC m=+1134.982477086"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.644434 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=32.644429055 podStartE2EDuration="32.644429055s" podCreationTimestamp="2026-02-23 06:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:42.639079471 +0000 UTC m=+1134.978408737" watchObservedRunningTime="2026-02-23 06:59:42.644429055 +0000 UTC m=+1134.983758321"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.650459 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668d74f4c6-dk9gw" event={"ID":"d0643197-bfab-42f2-bfef-85ab66daf967","Type":"ContainerStarted","Data":"f789f38e1ff4cfc9f3aea55c1b7d7f406c1e5477a4a6ba46ca01c341a465e91c"}
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.650529 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668d74f4c6-dk9gw" event={"ID":"d0643197-bfab-42f2-bfef-85ab66daf967","Type":"ContainerStarted","Data":"6b4f7424acae7dc62759458d73b53320f54d2f8667120a35c994c37d533f5e9b"}
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.651718 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-668d74f4c6-dk9gw"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.663414 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35","Type":"ContainerStarted","Data":"085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e"}
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.672536 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" podStartSLOduration=12.672520768 podStartE2EDuration="12.672520768s" podCreationTimestamp="2026-02-23 06:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:42.663441404 +0000 UTC m=+1135.002770670" watchObservedRunningTime="2026-02-23 06:59:42.672520768 +0000 UTC m=+1135.011850034"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.710573 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-668d74f4c6-dk9gw" podStartSLOduration=12.710556414 podStartE2EDuration="12.710556414s" podCreationTimestamp="2026-02-23 06:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:42.685958718 +0000 UTC m=+1135.025287983" watchObservedRunningTime="2026-02-23 06:59:42.710556414 +0000 UTC m=+1135.049885670"
Feb 23 06:59:42 crc kubenswrapper[4626]: I0223 06:59:42.733872 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-848f958bdf-dbqk8" podStartSLOduration=10.733861464 podStartE2EDuration="10.733861464s" podCreationTimestamp="2026-02-23 06:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:42.711019867 +0000 UTC m=+1135.050349134" watchObservedRunningTime="2026-02-23 06:59:42.733861464 +0000 UTC m=+1135.073190730"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.623236 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.643759 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9c5c7b856-snkxr"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.643802 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9c5c7b856-snkxr"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.727320 4626 generic.go:334] "Generic (PLEG): container finished" podID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerID="d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe" exitCode=0
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.727347 4626 generic.go:334] "Generic (PLEG): container finished" podID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerID="da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe" exitCode=143
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.727396 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"38f3c8f5-dca2-4b37-81cb-6d25edbf1035","Type":"ContainerDied","Data":"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe"}
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.727428 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"38f3c8f5-dca2-4b37-81cb-6d25edbf1035","Type":"ContainerDied","Data":"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe"}
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.727438 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"38f3c8f5-dca2-4b37-81cb-6d25edbf1035","Type":"ContainerDied","Data":"6f1c450013d3b52c28e04b567b136a2c31a1add052afabd57c9cf511f32bea3b"}
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.727454 4626 scope.go:117] "RemoveContainer" containerID="d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.727639 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.739147 4626 generic.go:334] "Generic (PLEG): container finished" podID="794ef965-1710-40d6-93ce-fc78c9799816" containerID="7719649822c37240944f8185d05b9d75b2c83ddf2ba1294fab32b832475a81f7" exitCode=0
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.739171 4626 generic.go:334] "Generic (PLEG): container finished" podID="794ef965-1710-40d6-93ce-fc78c9799816" containerID="dc78b29f4c4442aefbf175ce70ed96229c271f067a54cc0e7fd67fa804cb1bf5" exitCode=143
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.739894 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"794ef965-1710-40d6-93ce-fc78c9799816","Type":"ContainerDied","Data":"7719649822c37240944f8185d05b9d75b2c83ddf2ba1294fab32b832475a81f7"}
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.739929 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"794ef965-1710-40d6-93ce-fc78c9799816","Type":"ContainerDied","Data":"dc78b29f4c4442aefbf175ce70ed96229c271f067a54cc0e7fd67fa804cb1bf5"}
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.783160 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-688bccf86-4crkw"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.783613 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-688bccf86-4crkw"
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.792941 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-public-tls-certs\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.792987 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-scripts\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.793030 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6wcm\" (UniqueName: \"kubernetes.io/projected/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-kube-api-access-n6wcm\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.793123 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-httpd-run\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.793142 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-combined-ca-bundle\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.793185 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-logs\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.793210 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-config-data\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.793276 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\" (UID: \"38f3c8f5-dca2-4b37-81cb-6d25edbf1035\") "
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.797737 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-logs" (OuterVolumeSpecName: "logs") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.809365 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-scripts" (OuterVolumeSpecName: "scripts") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.809685 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). InnerVolumeSpecName "httpd-run".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.827655 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.830437 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-kube-api-access-n6wcm" (OuterVolumeSpecName: "kube-api-access-n6wcm") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). InnerVolumeSpecName "kube-api-access-n6wcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.847774 4626 scope.go:117] "RemoveContainer" containerID="da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.850785 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.863032 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.913993 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.914240 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.914249 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-logs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.914274 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.914285 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.914295 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6wcm\" (UniqueName: \"kubernetes.io/projected/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-kube-api-access-n6wcm\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.934539 4626 scope.go:117] "RemoveContainer" containerID="d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe" Feb 23 06:59:43 crc kubenswrapper[4626]: E0223 06:59:43.935659 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe\": container with ID starting with d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe not found: ID does not exist" containerID="d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.935700 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe"} err="failed to get container status \"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe\": rpc error: code = NotFound desc = could not find container \"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe\": container with ID starting with d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe not found: ID does not exist" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.935722 4626 scope.go:117] "RemoveContainer" containerID="da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe" Feb 23 06:59:43 crc kubenswrapper[4626]: E0223 06:59:43.936105 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe\": container with ID starting with da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe not found: ID does not exist" containerID="da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.936139 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe"} err="failed to get container status \"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe\": rpc error: code = NotFound desc = could not find container \"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe\": container 
with ID starting with da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe not found: ID does not exist" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.936154 4626 scope.go:117] "RemoveContainer" containerID="d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.938682 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe"} err="failed to get container status \"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe\": rpc error: code = NotFound desc = could not find container \"d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe\": container with ID starting with d538ffbe4b79c6d0c7b7bc1c5a94817c098133d17af263915d4fc02737b03ebe not found: ID does not exist" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.938708 4626 scope.go:117] "RemoveContainer" containerID="da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.940105 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe"} err="failed to get container status \"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe\": rpc error: code = NotFound desc = could not find container \"da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe\": container with ID starting with da773430f5cfa96ff721f498afb034ad72cab88025e39b0f3329a9bff268d5fe not found: ID does not exist" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.944458 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-config-data" (OuterVolumeSpecName: "config-data") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.948822 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "38f3c8f5-dca2-4b37-81cb-6d25edbf1035" (UID: "38f3c8f5-dca2-4b37-81cb-6d25edbf1035"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:43 crc kubenswrapper[4626]: I0223 06:59:43.954390 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.015965 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-config-data\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.016033 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4hxp\" (UniqueName: \"kubernetes.io/projected/794ef965-1710-40d6-93ce-fc78c9799816-kube-api-access-j4hxp\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.016108 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-internal-tls-certs\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.016141 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-combined-ca-bundle\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.016193 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.016221 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-logs\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.016238 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-httpd-run\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.016268 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-scripts\") pod \"794ef965-1710-40d6-93ce-fc78c9799816\" (UID: \"794ef965-1710-40d6-93ce-fc78c9799816\") " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.017284 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-logs" (OuterVolumeSpecName: "logs") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.017439 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.018415 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.018436 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-logs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.018451 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ef965-1710-40d6-93ce-fc78c9799816-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.018462 4626 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.018473 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f3c8f5-dca2-4b37-81cb-6d25edbf1035-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.026026 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794ef965-1710-40d6-93ce-fc78c9799816-kube-api-access-j4hxp" 
(OuterVolumeSpecName: "kube-api-access-j4hxp") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "kube-api-access-j4hxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.031932 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-scripts" (OuterVolumeSpecName: "scripts") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.035859 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.067801 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.067815 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.085377 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.101733 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: E0223 06:59:44.102152 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-log" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102173 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-log" Feb 23 06:59:44 crc kubenswrapper[4626]: E0223 06:59:44.102193 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="init" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102199 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="init" Feb 23 06:59:44 crc kubenswrapper[4626]: E0223 06:59:44.102209 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-log" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102215 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-log" Feb 23 06:59:44 crc kubenswrapper[4626]: E0223 06:59:44.102227 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-httpd" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102233 4626 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-httpd" Feb 23 06:59:44 crc kubenswrapper[4626]: E0223 06:59:44.102245 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102251 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" Feb 23 06:59:44 crc kubenswrapper[4626]: E0223 06:59:44.102269 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-httpd" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102274 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-httpd" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102430 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-log" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102447 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c870a4f-2c97-4836-9522-fd73c0a9d3ef" containerName="dnsmasq-dns" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102457 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" containerName="glance-httpd" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102469 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-httpd" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.102482 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ef965-1710-40d6-93ce-fc78c9799816" containerName="glance-log" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.103313 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.108124 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.108995 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.119845 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4hxp\" (UniqueName: \"kubernetes.io/projected/794ef965-1710-40d6-93ce-fc78c9799816-kube-api-access-j4hxp\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.119865 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.119889 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.119899 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.138588 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-config-data" (OuterVolumeSpecName: "config-data") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.148656 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.155530 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "794ef965-1710-40d6-93ce-fc78c9799816" (UID: "794ef965-1710-40d6-93ce-fc78c9799816"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.161465 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223174 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223369 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-config-data\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223422 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223775 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxx84\" (UniqueName: \"kubernetes.io/projected/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-kube-api-access-kxx84\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223858 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-scripts\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223874 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223897 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.223925 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-logs\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.224078 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.224089 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.224099 4626 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ef965-1710-40d6-93ce-fc78c9799816-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326348 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxx84\" (UniqueName: \"kubernetes.io/projected/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-kube-api-access-kxx84\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326406 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326428 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326448 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326466 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-logs\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326542 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326611 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-config-data\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.326634 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.327295 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.327830 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-logs\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.328603 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.339902 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-scripts\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.341677 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 
crc kubenswrapper[4626]: I0223 06:59:44.346028 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-config-data\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.348694 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.350248 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxx84\" (UniqueName: \"kubernetes.io/projected/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-kube-api-access-kxx84\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.365671 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") " pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.432215 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.750560 4626 generic.go:334] "Generic (PLEG): container finished" podID="498f5c1c-5f75-49e1-909a-e7ce904ebd9d" containerID="887e4fb3093570f30aaf87674069d3a964dda7b9537cf7d6ece01231c19c72aa" exitCode=0 Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.751278 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nxxgl" event={"ID":"498f5c1c-5f75-49e1-909a-e7ce904ebd9d","Type":"ContainerDied","Data":"887e4fb3093570f30aaf87674069d3a964dda7b9537cf7d6ece01231c19c72aa"} Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.753567 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"794ef965-1710-40d6-93ce-fc78c9799816","Type":"ContainerDied","Data":"49964ef277a7377912e18efe1817329028c81aa7cf0121c360882f8453be18ee"} Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.753628 4626 scope.go:117] "RemoveContainer" containerID="7719649822c37240944f8185d05b9d75b2c83ddf2ba1294fab32b832475a81f7" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.753772 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.765223 4626 generic.go:334] "Generic (PLEG): container finished" podID="a203dc9f-43a6-4cf4-ac68-7c5125053cba" containerID="71904e160c42cf04472b0c2a1700ca0d85acc9141bf20632c450a5294151df1b" exitCode=0 Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.767244 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7stdk" event={"ID":"a203dc9f-43a6-4cf4-ac68-7c5125053cba","Type":"ContainerDied","Data":"71904e160c42cf04472b0c2a1700ca0d85acc9141bf20632c450a5294151df1b"} Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.788418 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.798751 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.821311 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.829134 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.832205 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.832289 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.841140 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945193 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945257 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945299 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j8gd\" (UniqueName: \"kubernetes.io/projected/58f4f2ad-1b9a-4be4-a535-32183ce254a6-kube-api-access-8j8gd\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945339 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945401 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945461 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945480 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:44 crc kubenswrapper[4626]: I0223 06:59:44.945510 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.051167 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.051627 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.051729 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j8gd\" (UniqueName: \"kubernetes.io/projected/58f4f2ad-1b9a-4be4-a535-32183ce254a6-kube-api-access-8j8gd\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.051787 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.051809 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.052251 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.052466 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.052522 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.052547 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.053162 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.053276 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.060020 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.060329 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.073060 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.074174 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.076853 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j8gd\" (UniqueName: \"kubernetes.io/projected/58f4f2ad-1b9a-4be4-a535-32183ce254a6-kube-api-access-8j8gd\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " 
pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.100628 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") " pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.153134 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.775284 4626 generic.go:334] "Generic (PLEG): container finished" podID="99407a1e-403e-4460-8bff-8eb644010b4c" containerID="62d1aeac828dbe31c2b117686504cfca86362d4c695dde2b94f097502f9b946f" exitCode=0 Feb 23 06:59:45 crc kubenswrapper[4626]: I0223 06:59:45.775386 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x8cq" event={"ID":"99407a1e-403e-4460-8bff-8eb644010b4c","Type":"ContainerDied","Data":"62d1aeac828dbe31c2b117686504cfca86362d4c695dde2b94f097502f9b946f"} Feb 23 06:59:46 crc kubenswrapper[4626]: I0223 06:59:46.007211 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f3c8f5-dca2-4b37-81cb-6d25edbf1035" path="/var/lib/kubelet/pods/38f3c8f5-dca2-4b37-81cb-6d25edbf1035/volumes" Feb 23 06:59:46 crc kubenswrapper[4626]: I0223 06:59:46.008232 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794ef965-1710-40d6-93ce-fc78c9799816" path="/var/lib/kubelet/pods/794ef965-1710-40d6-93ce-fc78c9799816/volumes" Feb 23 06:59:46 crc kubenswrapper[4626]: I0223 06:59:46.894067 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.395122 4626 scope.go:117] "RemoveContainer" 
containerID="dc78b29f4c4442aefbf175ce70ed96229c271f067a54cc0e7fd67fa804cb1bf5" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.438666 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.474224 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7stdk" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.475754 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626524 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-config-data\") pod \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626563 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-config-data\") pod \"99407a1e-403e-4460-8bff-8eb644010b4c\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626613 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a203dc9f-43a6-4cf4-ac68-7c5125053cba-logs\") pod \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626632 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-db-sync-config-data\") pod \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\" (UID: 
\"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626680 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-fernet-keys\") pod \"99407a1e-403e-4460-8bff-8eb644010b4c\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626718 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nlxl\" (UniqueName: \"kubernetes.io/projected/a203dc9f-43a6-4cf4-ac68-7c5125053cba-kube-api-access-4nlxl\") pod \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626808 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-combined-ca-bundle\") pod \"99407a1e-403e-4460-8bff-8eb644010b4c\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626862 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-scripts\") pod \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626901 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhfbw\" (UniqueName: \"kubernetes.io/projected/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-kube-api-access-jhfbw\") pod \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626937 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-combined-ca-bundle\") pod \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\" (UID: \"498f5c1c-5f75-49e1-909a-e7ce904ebd9d\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.626959 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-scripts\") pod \"99407a1e-403e-4460-8bff-8eb644010b4c\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.627080 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-combined-ca-bundle\") pod \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\" (UID: \"a203dc9f-43a6-4cf4-ac68-7c5125053cba\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.627123 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-credential-keys\") pod \"99407a1e-403e-4460-8bff-8eb644010b4c\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.627140 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjb5n\" (UniqueName: \"kubernetes.io/projected/99407a1e-403e-4460-8bff-8eb644010b4c-kube-api-access-bjb5n\") pod \"99407a1e-403e-4460-8bff-8eb644010b4c\" (UID: \"99407a1e-403e-4460-8bff-8eb644010b4c\") " Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.628573 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a203dc9f-43a6-4cf4-ac68-7c5125053cba-logs" (OuterVolumeSpecName: "logs") pod "a203dc9f-43a6-4cf4-ac68-7c5125053cba" (UID: "a203dc9f-43a6-4cf4-ac68-7c5125053cba"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.649562 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "498f5c1c-5f75-49e1-909a-e7ce904ebd9d" (UID: "498f5c1c-5f75-49e1-909a-e7ce904ebd9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.656247 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-scripts" (OuterVolumeSpecName: "scripts") pod "a203dc9f-43a6-4cf4-ac68-7c5125053cba" (UID: "a203dc9f-43a6-4cf4-ac68-7c5125053cba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.662970 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-kube-api-access-jhfbw" (OuterVolumeSpecName: "kube-api-access-jhfbw") pod "498f5c1c-5f75-49e1-909a-e7ce904ebd9d" (UID: "498f5c1c-5f75-49e1-909a-e7ce904ebd9d"). InnerVolumeSpecName "kube-api-access-jhfbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.665443 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "498f5c1c-5f75-49e1-909a-e7ce904ebd9d" (UID: "498f5c1c-5f75-49e1-909a-e7ce904ebd9d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.666350 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "99407a1e-403e-4460-8bff-8eb644010b4c" (UID: "99407a1e-403e-4460-8bff-8eb644010b4c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.666457 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99407a1e-403e-4460-8bff-8eb644010b4c-kube-api-access-bjb5n" (OuterVolumeSpecName: "kube-api-access-bjb5n") pod "99407a1e-403e-4460-8bff-8eb644010b4c" (UID: "99407a1e-403e-4460-8bff-8eb644010b4c"). InnerVolumeSpecName "kube-api-access-bjb5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.667675 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-scripts" (OuterVolumeSpecName: "scripts") pod "99407a1e-403e-4460-8bff-8eb644010b4c" (UID: "99407a1e-403e-4460-8bff-8eb644010b4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.670003 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "99407a1e-403e-4460-8bff-8eb644010b4c" (UID: "99407a1e-403e-4460-8bff-8eb644010b4c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.672706 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a203dc9f-43a6-4cf4-ac68-7c5125053cba-kube-api-access-4nlxl" (OuterVolumeSpecName: "kube-api-access-4nlxl") pod "a203dc9f-43a6-4cf4-ac68-7c5125053cba" (UID: "a203dc9f-43a6-4cf4-ac68-7c5125053cba"). InnerVolumeSpecName "kube-api-access-4nlxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.674593 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-config-data" (OuterVolumeSpecName: "config-data") pod "a203dc9f-43a6-4cf4-ac68-7c5125053cba" (UID: "a203dc9f-43a6-4cf4-ac68-7c5125053cba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.677600 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-config-data" (OuterVolumeSpecName: "config-data") pod "99407a1e-403e-4460-8bff-8eb644010b4c" (UID: "99407a1e-403e-4460-8bff-8eb644010b4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.679721 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a203dc9f-43a6-4cf4-ac68-7c5125053cba" (UID: "a203dc9f-43a6-4cf4-ac68-7c5125053cba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.709891 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99407a1e-403e-4460-8bff-8eb644010b4c" (UID: "99407a1e-403e-4460-8bff-8eb644010b4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730559 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nlxl\" (UniqueName: \"kubernetes.io/projected/a203dc9f-43a6-4cf4-ac68-7c5125053cba-kube-api-access-4nlxl\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730637 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730651 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730664 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhfbw\" (UniqueName: \"kubernetes.io/projected/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-kube-api-access-jhfbw\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730673 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730682 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730692 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730701 4626 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730711 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjb5n\" (UniqueName: \"kubernetes.io/projected/99407a1e-403e-4460-8bff-8eb644010b4c-kube-api-access-bjb5n\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730721 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a203dc9f-43a6-4cf4-ac68-7c5125053cba-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730733 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730791 4626 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f5c1c-5f75-49e1-909a-e7ce904ebd9d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.730800 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a203dc9f-43a6-4cf4-ac68-7c5125053cba-logs\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc 
kubenswrapper[4626]: I0223 06:59:47.730809 4626 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99407a1e-403e-4460-8bff-8eb644010b4c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.819582 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nxxgl" event={"ID":"498f5c1c-5f75-49e1-909a-e7ce904ebd9d","Type":"ContainerDied","Data":"72b6a51bf784104dcdb1e1b09493666adca85d09636aec99ee126245582619d8"} Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.819932 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72b6a51bf784104dcdb1e1b09493666adca85d09636aec99ee126245582619d8" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.820006 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nxxgl" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.832972 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7stdk" event={"ID":"a203dc9f-43a6-4cf4-ac68-7c5125053cba","Type":"ContainerDied","Data":"4fa4c2f5791ed6dfeafcca2e792fcf7571d9c50c9c1c9f2a571b97ef11a12136"} Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.833020 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fa4c2f5791ed6dfeafcca2e792fcf7571d9c50c9c1c9f2a571b97ef11a12136" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.833103 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7stdk" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.839440 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6x8cq" event={"ID":"99407a1e-403e-4460-8bff-8eb644010b4c","Type":"ContainerDied","Data":"c3eb5e5ed5403ffe30c6132a2206efd37536c5594917200c7d05cc3d92f38279"} Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.839533 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3eb5e5ed5403ffe30c6132a2206efd37536c5594917200c7d05cc3d92f38279" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.839643 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6x8cq" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.940482 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6ffd4ff45f-xttfr"] Feb 23 06:59:47 crc kubenswrapper[4626]: E0223 06:59:47.940993 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99407a1e-403e-4460-8bff-8eb644010b4c" containerName="keystone-bootstrap" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.941007 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="99407a1e-403e-4460-8bff-8eb644010b4c" containerName="keystone-bootstrap" Feb 23 06:59:47 crc kubenswrapper[4626]: E0223 06:59:47.941029 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498f5c1c-5f75-49e1-909a-e7ce904ebd9d" containerName="barbican-db-sync" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.941037 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="498f5c1c-5f75-49e1-909a-e7ce904ebd9d" containerName="barbican-db-sync" Feb 23 06:59:47 crc kubenswrapper[4626]: E0223 06:59:47.941047 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a203dc9f-43a6-4cf4-ac68-7c5125053cba" containerName="placement-db-sync" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 
06:59:47.941054 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a203dc9f-43a6-4cf4-ac68-7c5125053cba" containerName="placement-db-sync" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.941259 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="498f5c1c-5f75-49e1-909a-e7ce904ebd9d" containerName="barbican-db-sync" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.941290 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="99407a1e-403e-4460-8bff-8eb644010b4c" containerName="keystone-bootstrap" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.941304 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a203dc9f-43a6-4cf4-ac68-7c5125053cba" containerName="placement-db-sync" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.942016 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.949613 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.949743 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.949658 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.949960 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-25kl2" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.949698 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 06:59:47 crc kubenswrapper[4626]: I0223 06:59:47.950218 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 
06:59:48.009048 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6ffd4ff45f-xttfr"] Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.009158 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036171 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-scripts\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036240 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-combined-ca-bundle\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036301 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-fernet-keys\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036331 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-credential-keys\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036356 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc6z6\" (UniqueName: \"kubernetes.io/projected/9f0e3a3c-7106-4f7e-af92-a329a82fc625-kube-api-access-fc6z6\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036433 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-internal-tls-certs\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036451 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-config-data\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.036470 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-public-tls-certs\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138229 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-internal-tls-certs\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138269 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-config-data\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138292 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-public-tls-certs\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138352 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-scripts\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138395 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-combined-ca-bundle\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138446 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-fernet-keys\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138473 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-credential-keys\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.138516 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc6z6\" (UniqueName: \"kubernetes.io/projected/9f0e3a3c-7106-4f7e-af92-a329a82fc625-kube-api-access-fc6z6\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.143452 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-public-tls-certs\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.149004 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-credential-keys\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.149130 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-internal-tls-certs\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.149476 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-config-data\") pod 
\"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.149959 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-combined-ca-bundle\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.150059 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-fernet-keys\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.150418 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f0e3a3c-7106-4f7e-af92-a329a82fc625-scripts\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.153799 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc6z6\" (UniqueName: \"kubernetes.io/projected/9f0e3a3c-7106-4f7e-af92-a329a82fc625-kube-api-access-fc6z6\") pod \"keystone-6ffd4ff45f-xttfr\" (UID: \"9f0e3a3c-7106-4f7e-af92-a329a82fc625\") " pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.267508 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.669084 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6495996568-xfqgf"] Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.670915 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.675159 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cjbtp" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.678808 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.678912 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.716857 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6495996568-xfqgf"] Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.751661 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-combined-ca-bundle\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.751965 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd22d53-a38c-4579-b6fd-e7934e32ca47-logs\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " 
pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.752011 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-config-data\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.752080 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-config-data-custom\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.752210 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtmw9\" (UniqueName: \"kubernetes.io/projected/3bd22d53-a38c-4579-b6fd-e7934e32ca47-kube-api-access-rtmw9\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.854553 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd22d53-a38c-4579-b6fd-e7934e32ca47-logs\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.854618 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-config-data\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.854670 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-config-data-custom\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.854787 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtmw9\" (UniqueName: \"kubernetes.io/projected/3bd22d53-a38c-4579-b6fd-e7934e32ca47-kube-api-access-rtmw9\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.854849 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-combined-ca-bundle\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.862462 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd22d53-a38c-4579-b6fd-e7934e32ca47-logs\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.875526 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-combined-ca-bundle\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.891806 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-config-data-custom\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.911041 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtmw9\" (UniqueName: \"kubernetes.io/projected/3bd22d53-a38c-4579-b6fd-e7934e32ca47-kube-api-access-rtmw9\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.911114 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7ccd97cd69-bpkw8"] Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.928936 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.930904 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd22d53-a38c-4579-b6fd-e7934e32ca47-config-data\") pod \"barbican-keystone-listener-6495996568-xfqgf\" (UID: \"3bd22d53-a38c-4579-b6fd-e7934e32ca47\") " pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.951020 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.979548 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7ccd97cd69-bpkw8"] Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.996752 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd9b88b4f-724lk"] Feb 23 06:59:48 crc kubenswrapper[4626]: I0223 06:59:48.997011 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" containerName="dnsmasq-dns" containerID="cri-o://949ba8dbc8bc52479eeeff167ea47af212c50fdd1f26639d7198135fb37578a8" gracePeriod=10 Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.001196 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-96d55c7d9-glvbc"] Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.003102 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.003595 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96d55c7d9-glvbc"] Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.012034 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.013021 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6cb97bcbf6-sl6hx"] Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.017732 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6495996568-xfqgf" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.039957 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cb97bcbf6-sl6hx"] Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.040077 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.050277 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-864c594bfd-tq9x6"] Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.051648 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-864c594bfd-tq9x6"] Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.051730 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.052300 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.053046 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.053171 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dglfj" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.053270 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.053374 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.059195 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.063270 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-combined-ca-bundle\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.063352 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec78427-155b-4ed6-8d16-e56f099473c1-logs\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.063564 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-config-data\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.064310 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-config-data-custom\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.064364 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7h2\" (UniqueName: \"kubernetes.io/projected/6ec78427-155b-4ed6-8d16-e56f099473c1-kube-api-access-cr7h2\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.170075 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-config\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.170347 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-sb\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 
06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.170382 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-config-data-custom\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.170410 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7h2\" (UniqueName: \"kubernetes.io/projected/6ec78427-155b-4ed6-8d16-e56f099473c1-kube-api-access-cr7h2\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171342 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296f373f-42ac-474f-bc36-eab630843ed1-logs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171640 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-public-tls-certs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171689 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data-custom\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " 
pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171721 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-svc\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171900 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171921 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa75531-7ff2-46e6-b665-9819933be8aa-logs\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171940 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-nb\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.171997 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-internal-tls-certs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") 
" pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.172048 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-swift-storage-0\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.172085 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-combined-ca-bundle\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.172989 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec78427-155b-4ed6-8d16-e56f099473c1-logs\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173091 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5j2\" (UniqueName: \"kubernetes.io/projected/dc2ac61b-c277-49f1-becb-040e73b53e8a-kube-api-access-zn5j2\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173147 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-scripts\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: 
\"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173191 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k622\" (UniqueName: \"kubernetes.io/projected/2fa75531-7ff2-46e6-b665-9819933be8aa-kube-api-access-7k622\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173209 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-combined-ca-bundle\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173234 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-combined-ca-bundle\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173252 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5twf\" (UniqueName: \"kubernetes.io/projected/296f373f-42ac-474f-bc36-eab630843ed1-kube-api-access-z5twf\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173286 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-config-data\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.173312 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-config-data\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.176023 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ec78427-155b-4ed6-8d16-e56f099473c1-logs\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.182236 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-config-data\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.190235 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7h2\" (UniqueName: \"kubernetes.io/projected/6ec78427-155b-4ed6-8d16-e56f099473c1-kube-api-access-cr7h2\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.192170 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-config-data-custom\") 
pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.193221 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ec78427-155b-4ed6-8d16-e56f099473c1-combined-ca-bundle\") pod \"barbican-worker-7ccd97cd69-bpkw8\" (UID: \"6ec78427-155b-4ed6-8d16-e56f099473c1\") " pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275193 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-config\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275245 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-sb\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275296 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296f373f-42ac-474f-bc36-eab630843ed1-logs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275366 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-public-tls-certs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " 
pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275392 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data-custom\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275414 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-svc\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275522 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275545 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa75531-7ff2-46e6-b665-9819933be8aa-logs\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275569 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-nb\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 
06:59:49.275606 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-internal-tls-certs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275646 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-swift-storage-0\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275719 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5j2\" (UniqueName: \"kubernetes.io/projected/dc2ac61b-c277-49f1-becb-040e73b53e8a-kube-api-access-zn5j2\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275750 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-scripts\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275779 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k622\" (UniqueName: \"kubernetes.io/projected/2fa75531-7ff2-46e6-b665-9819933be8aa-kube-api-access-7k622\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275797 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-combined-ca-bundle\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275816 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-combined-ca-bundle\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275831 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5twf\" (UniqueName: \"kubernetes.io/projected/296f373f-42ac-474f-bc36-eab630843ed1-kube-api-access-z5twf\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.275858 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-config-data\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.276634 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296f373f-42ac-474f-bc36-eab630843ed1-logs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.277919 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-swift-storage-0\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.276634 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-config\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.279091 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-config-data\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.279300 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-internal-tls-certs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.281873 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-scripts\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.282178 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-svc\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" 
(UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.282354 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-sb\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.282464 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa75531-7ff2-46e6-b665-9819933be8aa-logs\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.282938 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-nb\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.283282 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data-custom\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.283985 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-combined-ca-bundle\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 
06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.291415 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k622\" (UniqueName: \"kubernetes.io/projected/2fa75531-7ff2-46e6-b665-9819933be8aa-kube-api-access-7k622\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.291977 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-combined-ca-bundle\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.294437 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f373f-42ac-474f-bc36-eab630843ed1-public-tls-certs\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.296164 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data\") pod \"barbican-api-864c594bfd-tq9x6\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.300151 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5j2\" (UniqueName: \"kubernetes.io/projected/dc2ac61b-c277-49f1-becb-040e73b53e8a-kube-api-access-zn5j2\") pod \"dnsmasq-dns-96d55c7d9-glvbc\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") " pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.302389 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5twf\" (UniqueName: \"kubernetes.io/projected/296f373f-42ac-474f-bc36-eab630843ed1-kube-api-access-z5twf\") pod \"placement-6cb97bcbf6-sl6hx\" (UID: \"296f373f-42ac-474f-bc36-eab630843ed1\") " pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.332182 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7ccd97cd69-bpkw8" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.427075 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.449175 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.474552 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.934070 4626 generic.go:334] "Generic (PLEG): container finished" podID="8dcb0164-5b5e-456b-9918-f42483a73601" containerID="949ba8dbc8bc52479eeeff167ea47af212c50fdd1f26639d7198135fb37578a8" exitCode=0 Feb 23 06:59:49 crc kubenswrapper[4626]: I0223 06:59:49.934120 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" event={"ID":"8dcb0164-5b5e-456b-9918-f42483a73601","Type":"ContainerDied","Data":"949ba8dbc8bc52479eeeff167ea47af212c50fdd1f26639d7198135fb37578a8"} Feb 23 06:59:50 crc kubenswrapper[4626]: I0223 06:59:50.460050 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Feb 23 06:59:51 crc kubenswrapper[4626]: 
I0223 06:59:51.604759 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6958fdb966-vkk9n"] Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.607743 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.610106 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.616988 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.651111 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6958fdb966-vkk9n"] Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.737698 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-config-data\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.737745 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pppc9\" (UniqueName: \"kubernetes.io/projected/e591795d-67ce-48d5-a54e-2f989878eca9-kube-api-access-pppc9\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.737771 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-public-tls-certs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") 
" pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.738023 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-config-data-custom\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.738080 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-internal-tls-certs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.738164 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-combined-ca-bundle\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.738330 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e591795d-67ce-48d5-a54e-2f989878eca9-logs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.840071 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-config-data-custom\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: 
\"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.840388 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-internal-tls-certs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.840442 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-combined-ca-bundle\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.840567 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e591795d-67ce-48d5-a54e-2f989878eca9-logs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.840624 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-config-data\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.840649 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pppc9\" (UniqueName: \"kubernetes.io/projected/e591795d-67ce-48d5-a54e-2f989878eca9-kube-api-access-pppc9\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " 
pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.840665 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-public-tls-certs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.841236 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e591795d-67ce-48d5-a54e-2f989878eca9-logs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.845959 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-config-data-custom\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.854090 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-config-data\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.854933 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-combined-ca-bundle\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 
06:59:51.861150 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pppc9\" (UniqueName: \"kubernetes.io/projected/e591795d-67ce-48d5-a54e-2f989878eca9-kube-api-access-pppc9\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.867850 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-public-tls-certs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.870704 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e591795d-67ce-48d5-a54e-2f989878eca9-internal-tls-certs\") pod \"barbican-api-6958fdb966-vkk9n\" (UID: \"e591795d-67ce-48d5-a54e-2f989878eca9\") " pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:51 crc kubenswrapper[4626]: I0223 06:59:51.925740 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:53 crc kubenswrapper[4626]: W0223 06:59:53.238667 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70eb0f85_ccb5_4ba7_b3bd_f586483ca336.slice/crio-c5518656b5fe263d3d3fe173e4f6fdabfe62cfa97711a06da45ab6efeeab0898 WatchSource:0}: Error finding container c5518656b5fe263d3d3fe173e4f6fdabfe62cfa97711a06da45ab6efeeab0898: Status 404 returned error can't find the container with id c5518656b5fe263d3d3fe173e4f6fdabfe62cfa97711a06da45ab6efeeab0898 Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.376187 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.479477 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-sb\") pod \"8dcb0164-5b5e-456b-9918-f42483a73601\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.479607 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-svc\") pod \"8dcb0164-5b5e-456b-9918-f42483a73601\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.479737 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-swift-storage-0\") pod \"8dcb0164-5b5e-456b-9918-f42483a73601\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.479779 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-nb\") pod \"8dcb0164-5b5e-456b-9918-f42483a73601\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.479938 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-config\") pod \"8dcb0164-5b5e-456b-9918-f42483a73601\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.479992 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8bq5\" 
(UniqueName: \"kubernetes.io/projected/8dcb0164-5b5e-456b-9918-f42483a73601-kube-api-access-l8bq5\") pod \"8dcb0164-5b5e-456b-9918-f42483a73601\" (UID: \"8dcb0164-5b5e-456b-9918-f42483a73601\") " Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.492487 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dcb0164-5b5e-456b-9918-f42483a73601-kube-api-access-l8bq5" (OuterVolumeSpecName: "kube-api-access-l8bq5") pod "8dcb0164-5b5e-456b-9918-f42483a73601" (UID: "8dcb0164-5b5e-456b-9918-f42483a73601"). InnerVolumeSpecName "kube-api-access-l8bq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.585144 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8bq5\" (UniqueName: \"kubernetes.io/projected/8dcb0164-5b5e-456b-9918-f42483a73601-kube-api-access-l8bq5\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.637671 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9c5c7b856-snkxr" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.648719 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8dcb0164-5b5e-456b-9918-f42483a73601" (UID: "8dcb0164-5b5e-456b-9918-f42483a73601"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.665356 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8dcb0164-5b5e-456b-9918-f42483a73601" (UID: "8dcb0164-5b5e-456b-9918-f42483a73601"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.674212 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8dcb0164-5b5e-456b-9918-f42483a73601" (UID: "8dcb0164-5b5e-456b-9918-f42483a73601"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.674779 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-config" (OuterVolumeSpecName: "config") pod "8dcb0164-5b5e-456b-9918-f42483a73601" (UID: "8dcb0164-5b5e-456b-9918-f42483a73601"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.679947 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8dcb0164-5b5e-456b-9918-f42483a73601" (UID: "8dcb0164-5b5e-456b-9918-f42483a73601"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.687110 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.687159 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.687174 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.687186 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.687198 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dcb0164-5b5e-456b-9918-f42483a73601-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:53 crc kubenswrapper[4626]: I0223 06:59:53.810790 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-688bccf86-4crkw" podUID="d3e1e535-58de-4987-9d93-65fb6d4c9409" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.053775 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" 
event={"ID":"8dcb0164-5b5e-456b-9918-f42483a73601","Type":"ContainerDied","Data":"cbf0d9dee681d5e2aed577d3aae8d66585ce818c880d6712aedc117a58d8b0f4"} Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.053820 4626 scope.go:117] "RemoveContainer" containerID="949ba8dbc8bc52479eeeff167ea47af212c50fdd1f26639d7198135fb37578a8" Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.053923 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd9b88b4f-724lk" Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.062874 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"70eb0f85-ccb5-4ba7-b3bd-f586483ca336","Type":"ContainerStarted","Data":"c5518656b5fe263d3d3fe173e4f6fdabfe62cfa97711a06da45ab6efeeab0898"} Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.087598 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd9b88b4f-724lk"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.162985 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd9b88b4f-724lk"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.185857 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.224843 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6495996568-xfqgf"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.227968 4626 scope.go:117] "RemoveContainer" containerID="ad17177ef36fac13085c50e9ad24262a26c30c8a6c34e7e88ef3c219c3e64999" Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.371023 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cb97bcbf6-sl6hx"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.440399 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-7ccd97cd69-bpkw8"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.636543 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-864c594bfd-tq9x6"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.649786 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96d55c7d9-glvbc"] Feb 23 06:59:54 crc kubenswrapper[4626]: W0223 06:59:54.671159 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fa75531_7ff2_46e6_b665_9819933be8aa.slice/crio-f5a87233f6b9fd401bb53bf9348bad49d855d50bb4e74b4e3c6e1f199a9e309d WatchSource:0}: Error finding container f5a87233f6b9fd401bb53bf9348bad49d855d50bb4e74b4e3c6e1f199a9e309d: Status 404 returned error can't find the container with id f5a87233f6b9fd401bb53bf9348bad49d855d50bb4e74b4e3c6e1f199a9e309d Feb 23 06:59:54 crc kubenswrapper[4626]: W0223 06:59:54.674752 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc2ac61b_c277_49f1_becb_040e73b53e8a.slice/crio-dd895b209a40d93d61118cfc2fdd62bb77ad1bcd68e7a36a7af0aabce84641c1 WatchSource:0}: Error finding container dd895b209a40d93d61118cfc2fdd62bb77ad1bcd68e7a36a7af0aabce84641c1: Status 404 returned error can't find the container with id dd895b209a40d93d61118cfc2fdd62bb77ad1bcd68e7a36a7af0aabce84641c1 Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.790727 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6958fdb966-vkk9n"] Feb 23 06:59:54 crc kubenswrapper[4626]: I0223 06:59:54.800967 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6ffd4ff45f-xttfr"] Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.089909 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cb97bcbf6-sl6hx" 
event={"ID":"296f373f-42ac-474f-bc36-eab630843ed1","Type":"ContainerStarted","Data":"26d6c54a4c166dc669ee77aa4acdd081c73aa1fbab65f454a7d11c3703ca5951"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.089964 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cb97bcbf6-sl6hx" event={"ID":"296f373f-42ac-474f-bc36-eab630843ed1","Type":"ContainerStarted","Data":"2cc733b6ee879dc5b921c9f11f29f6afd6919e620bf8c81c6f146f4cba01533b"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.097488 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"70eb0f85-ccb5-4ba7-b3bd-f586483ca336","Type":"ContainerStarted","Data":"7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.101723 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" event={"ID":"dc2ac61b-c277-49f1-becb-040e73b53e8a","Type":"ContainerStarted","Data":"dd895b209a40d93d61118cfc2fdd62bb77ad1bcd68e7a36a7af0aabce84641c1"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.104758 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58f4f2ad-1b9a-4be4-a535-32183ce254a6","Type":"ContainerStarted","Data":"8fecba7cfe5a4a4fe9099363fbd84a5c4ec664248fe5fc372a89f926d575fc94"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.111058 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fttdm" event={"ID":"c1806f1a-08dd-4b17-a799-1122348a4ab3","Type":"ContainerStarted","Data":"a4c287ec73a0eec2d6f0a4c85df408793997ca1b2130c7115cb3bc92e74a7ac7"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.128095 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ccd97cd69-bpkw8" 
event={"ID":"6ec78427-155b-4ed6-8d16-e56f099473c1","Type":"ContainerStarted","Data":"8046ddf80fa5f35988df0c65d78a46bce0e61f4f58e3ca43cfba713411372659"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.131798 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6ffd4ff45f-xttfr" event={"ID":"9f0e3a3c-7106-4f7e-af92-a329a82fc625","Type":"ContainerStarted","Data":"6913f0464cb197dff32e3f3be504a5c0b3839d005b8a4afaaa78971111e1d2f5"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.134710 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-fttdm" podStartSLOduration=2.9886467100000003 podStartE2EDuration="52.134696357s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="2026-02-23 06:59:04.675646867 +0000 UTC m=+1097.014976133" lastFinishedPulling="2026-02-23 06:59:53.821696513 +0000 UTC m=+1146.161025780" observedRunningTime="2026-02-23 06:59:55.128790143 +0000 UTC m=+1147.468119409" watchObservedRunningTime="2026-02-23 06:59:55.134696357 +0000 UTC m=+1147.474025624" Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.136012 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35","Type":"ContainerStarted","Data":"47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.146054 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6958fdb966-vkk9n" event={"ID":"e591795d-67ce-48d5-a54e-2f989878eca9","Type":"ContainerStarted","Data":"70a2651c8b3710aeb88f4b633afd7d7d13fc5efc5635aa486107aa024cb0c267"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.150838 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-864c594bfd-tq9x6" 
event={"ID":"2fa75531-7ff2-46e6-b665-9819933be8aa","Type":"ContainerStarted","Data":"7c93a647cb80655e652f053be44700434bf9f2752870478ab8b312a179de3ccc"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.150872 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-864c594bfd-tq9x6" event={"ID":"2fa75531-7ff2-46e6-b665-9819933be8aa","Type":"ContainerStarted","Data":"f5a87233f6b9fd401bb53bf9348bad49d855d50bb4e74b4e3c6e1f199a9e309d"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.152580 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6495996568-xfqgf" event={"ID":"3bd22d53-a38c-4579-b6fd-e7934e32ca47","Type":"ContainerStarted","Data":"1fdcc67c278e016d8e0bf3a01b5ae39d05c2398e41e3f511e5915bca50bda861"} Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.685969 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:59:55 crc kubenswrapper[4626]: I0223 06:59:55.686254 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.013820 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" path="/var/lib/kubelet/pods/8dcb0164-5b5e-456b-9918-f42483a73601/volumes" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.311384 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-864c594bfd-tq9x6" 
event={"ID":"2fa75531-7ff2-46e6-b665-9819933be8aa","Type":"ContainerStarted","Data":"e8561be61481e0551fb638547d7448bf977eadb55f2430eebc4850060359fd68"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.311705 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.311726 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.313965 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cb97bcbf6-sl6hx" event={"ID":"296f373f-42ac-474f-bc36-eab630843ed1","Type":"ContainerStarted","Data":"5685b12887d1df66d5e0c650dbd02447cab3c46982b85a0c52ec811905872281"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.314204 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.314719 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cb97bcbf6-sl6hx" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.349800 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6ffd4ff45f-xttfr" event={"ID":"9f0e3a3c-7106-4f7e-af92-a329a82fc625","Type":"ContainerStarted","Data":"46c189895c78cb34a56626cb823ffe950caad3a598a00f20a7691d7f1b49532d"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.349890 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6ffd4ff45f-xttfr" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.369188 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-864c594bfd-tq9x6" podStartSLOduration=8.369172004 podStartE2EDuration="8.369172004s" podCreationTimestamp="2026-02-23 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:56.332303817 +0000 UTC m=+1148.671633084" watchObservedRunningTime="2026-02-23 06:59:56.369172004 +0000 UTC m=+1148.708501270" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.382472 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"70eb0f85-ccb5-4ba7-b3bd-f586483ca336","Type":"ContainerStarted","Data":"968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.404926 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6ffd4ff45f-xttfr" podStartSLOduration=9.404911332 podStartE2EDuration="9.404911332s" podCreationTimestamp="2026-02-23 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:56.403958185 +0000 UTC m=+1148.743287451" watchObservedRunningTime="2026-02-23 06:59:56.404911332 +0000 UTC m=+1148.744240598" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.409650 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6cb97bcbf6-sl6hx" podStartSLOduration=8.409641128 podStartE2EDuration="8.409641128s" podCreationTimestamp="2026-02-23 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:56.372230329 +0000 UTC m=+1148.711559595" watchObservedRunningTime="2026-02-23 06:59:56.409641128 +0000 UTC m=+1148.748970394" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.424587 4626 generic.go:334] "Generic (PLEG): container finished" podID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerID="494206962a10a823f9210af48fa50d5c7a69c60c982c62ba90c2962d8e61ecf5" exitCode=0 Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.424881 
4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" event={"ID":"dc2ac61b-c277-49f1-becb-040e73b53e8a","Type":"ContainerDied","Data":"494206962a10a823f9210af48fa50d5c7a69c60c982c62ba90c2962d8e61ecf5"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.437652 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.437639153 podStartE2EDuration="12.437639153s" podCreationTimestamp="2026-02-23 06:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:56.436159675 +0000 UTC m=+1148.775488941" watchObservedRunningTime="2026-02-23 06:59:56.437639153 +0000 UTC m=+1148.776968420" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.465130 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58f4f2ad-1b9a-4be4-a535-32183ce254a6","Type":"ContainerStarted","Data":"13b00ec4e33abde6af8fa25378a16255506bb30adff2ddb2d6b3866f55040d08"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.554480 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6958fdb966-vkk9n" event={"ID":"e591795d-67ce-48d5-a54e-2f989878eca9","Type":"ContainerStarted","Data":"3c8e1a77f6c3adc98c15601548e9ff3a4326ac1670ec9a1cdc5fdb7c2af817fc"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.558722 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6958fdb966-vkk9n" event={"ID":"e591795d-67ce-48d5-a54e-2f989878eca9","Type":"ContainerStarted","Data":"e3a9b991a2d0ab8efa7dd1334fbc0d13619cbc6c6687cba9cc9bfe9627f5c363"} Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.558814 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 
06:59:56.558883 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6958fdb966-vkk9n" Feb 23 06:59:56 crc kubenswrapper[4626]: I0223 06:59:56.572907 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6958fdb966-vkk9n" podStartSLOduration=5.572891884 podStartE2EDuration="5.572891884s" podCreationTimestamp="2026-02-23 06:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:56.572618338 +0000 UTC m=+1148.911947604" watchObservedRunningTime="2026-02-23 06:59:56.572891884 +0000 UTC m=+1148.912221140" Feb 23 06:59:57 crc kubenswrapper[4626]: I0223 06:59:57.638387 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-47npm" event={"ID":"c5cc94ca-558e-4a2c-8d28-5aedbecb3090","Type":"ContainerStarted","Data":"a57d9b1f428f07b68f9321332ba192416d73ec5cf8cd22ee06bda8881d2a494c"} Feb 23 06:59:57 crc kubenswrapper[4626]: I0223 06:59:57.646751 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58f4f2ad-1b9a-4be4-a535-32183ce254a6","Type":"ContainerStarted","Data":"d8c9929a3dcc07967d2a2cb42b1477ead3837767b756bbeab0bfbc273efc72a5"} Feb 23 06:59:57 crc kubenswrapper[4626]: I0223 06:59:57.688221 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.688199581 podStartE2EDuration="13.688199581s" podCreationTimestamp="2026-02-23 06:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:57.682874062 +0000 UTC m=+1150.022203327" watchObservedRunningTime="2026-02-23 06:59:57.688199581 +0000 UTC m=+1150.027528846" Feb 23 06:59:57 crc kubenswrapper[4626]: I0223 06:59:57.731878 4626 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-47npm" podStartSLOduration=3.963874655 podStartE2EDuration="54.731842416s" podCreationTimestamp="2026-02-23 06:59:03 +0000 UTC" firstStartedPulling="2026-02-23 06:59:04.978921634 +0000 UTC m=+1097.318250901" lastFinishedPulling="2026-02-23 06:59:55.746889396 +0000 UTC m=+1148.086218662" observedRunningTime="2026-02-23 06:59:57.700549289 +0000 UTC m=+1150.039878555" watchObservedRunningTime="2026-02-23 06:59:57.731842416 +0000 UTC m=+1150.071171682" Feb 23 06:59:58 crc kubenswrapper[4626]: I0223 06:59:58.663782 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" event={"ID":"dc2ac61b-c277-49f1-becb-040e73b53e8a","Type":"ContainerStarted","Data":"f9f57a3ef8c003cf3ddc4d939666fb8d9a093b48941450d1cf8e40cceff88fa3"} Feb 23 06:59:58 crc kubenswrapper[4626]: I0223 06:59:58.665821 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" Feb 23 06:59:58 crc kubenswrapper[4626]: I0223 06:59:58.671442 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6495996568-xfqgf" event={"ID":"3bd22d53-a38c-4579-b6fd-e7934e32ca47","Type":"ContainerStarted","Data":"678a31bab5b85b9b28b71c364101cb69a793ae365df8ef1a3676fc94d9d67fe5"} Feb 23 06:59:58 crc kubenswrapper[4626]: I0223 06:59:58.683953 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ccd97cd69-bpkw8" event={"ID":"6ec78427-155b-4ed6-8d16-e56f099473c1","Type":"ContainerStarted","Data":"ee876aba078ce2585d05db629bd39a592a81abfe648b2753778898d57c0c0a69"} Feb 23 06:59:58 crc kubenswrapper[4626]: I0223 06:59:58.689734 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" podStartSLOduration=10.689722363 podStartE2EDuration="10.689722363s" podCreationTimestamp="2026-02-23 06:59:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:58.683307981 +0000 UTC m=+1151.022637247" watchObservedRunningTime="2026-02-23 06:59:58.689722363 +0000 UTC m=+1151.029051628" Feb 23 06:59:59 crc kubenswrapper[4626]: I0223 06:59:59.697393 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6495996568-xfqgf" event={"ID":"3bd22d53-a38c-4579-b6fd-e7934e32ca47","Type":"ContainerStarted","Data":"30e0a086968582fd523a8eca745b8bbb2827c57c20bd0aa6f24f503ba44f6066"} Feb 23 06:59:59 crc kubenswrapper[4626]: I0223 06:59:59.700009 4626 generic.go:334] "Generic (PLEG): container finished" podID="c1806f1a-08dd-4b17-a799-1122348a4ab3" containerID="a4c287ec73a0eec2d6f0a4c85df408793997ca1b2130c7115cb3bc92e74a7ac7" exitCode=0 Feb 23 06:59:59 crc kubenswrapper[4626]: I0223 06:59:59.700136 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fttdm" event={"ID":"c1806f1a-08dd-4b17-a799-1122348a4ab3","Type":"ContainerDied","Data":"a4c287ec73a0eec2d6f0a4c85df408793997ca1b2130c7115cb3bc92e74a7ac7"} Feb 23 06:59:59 crc kubenswrapper[4626]: I0223 06:59:59.705944 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7ccd97cd69-bpkw8" event={"ID":"6ec78427-155b-4ed6-8d16-e56f099473c1","Type":"ContainerStarted","Data":"933cfbe39f9806ad13e81eb81d95f3c59ea24f6c839bfe572b8d3f43199f8f2d"} Feb 23 06:59:59 crc kubenswrapper[4626]: I0223 06:59:59.722681 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6495996568-xfqgf" podStartSLOduration=7.823707789 podStartE2EDuration="11.722668618s" podCreationTimestamp="2026-02-23 06:59:48 +0000 UTC" firstStartedPulling="2026-02-23 06:59:54.254992463 +0000 UTC m=+1146.594321729" lastFinishedPulling="2026-02-23 06:59:58.153953292 +0000 UTC m=+1150.493282558" observedRunningTime="2026-02-23 
06:59:59.720740233 +0000 UTC m=+1152.060069499" watchObservedRunningTime="2026-02-23 06:59:59.722668618 +0000 UTC m=+1152.061997883" Feb 23 06:59:59 crc kubenswrapper[4626]: I0223 06:59:59.742327 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7ccd97cd69-bpkw8" podStartSLOduration=8.046563761 podStartE2EDuration="11.742309411s" podCreationTimestamp="2026-02-23 06:59:48 +0000 UTC" firstStartedPulling="2026-02-23 06:59:54.461478376 +0000 UTC m=+1146.800807643" lastFinishedPulling="2026-02-23 06:59:58.157224027 +0000 UTC m=+1150.496553293" observedRunningTime="2026-02-23 06:59:59.735120358 +0000 UTC m=+1152.074449624" watchObservedRunningTime="2026-02-23 06:59:59.742309411 +0000 UTC m=+1152.081638677" Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.151168 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"] Feb 23 07:00:00 crc kubenswrapper[4626]: E0223 07:00:00.152578 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" containerName="dnsmasq-dns" Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.152693 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" containerName="dnsmasq-dns" Feb 23 07:00:00 crc kubenswrapper[4626]: E0223 07:00:00.152768 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" containerName="init" Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.152838 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" containerName="init" Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.153116 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dcb0164-5b5e-456b-9918-f42483a73601" containerName="dnsmasq-dns" Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.155057 4626 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.158270 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.158520 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.169582 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"]
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.244820 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqnd7\" (UniqueName: \"kubernetes.io/projected/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-kube-api-access-sqnd7\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.244984 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-secret-volume\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.245129 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-config-volume\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.347795 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqnd7\" (UniqueName: \"kubernetes.io/projected/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-kube-api-access-sqnd7\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.347956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-secret-volume\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.348190 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-config-volume\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.349005 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-config-volume\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.355007 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-secret-volume\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.368075 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqnd7\" (UniqueName: \"kubernetes.io/projected/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-kube-api-access-sqnd7\") pod \"collect-profiles-29530500-w6dnf\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:00 crc kubenswrapper[4626]: I0223 07:00:00.474114 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.009686 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-668d74f4c6-dk9gw"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.237089 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-848f958bdf-dbqk8"]
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.241387 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-848f958bdf-dbqk8" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-api" containerID="cri-o://5910599f46b3fdc6c40589f88c6a1b675f88837cad419ce983039d97c9e5595b" gracePeriod=30
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.241558 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-848f958bdf-dbqk8" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-httpd" containerID="cri-o://01305754c22a518c1d956f73ab61f2f9e3770c5759554be086ff146263c604af" gracePeriod=30
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.262537 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-755f5b5889-45mmc"]
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.267554 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.284790 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-755f5b5889-45mmc"]
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.356135 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-848f958bdf-dbqk8" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": read tcp 10.217.0.2:55166->10.217.0.156:9696: read: connection reset by peer"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.388485 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-httpd-config\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.388619 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-public-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.388648 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-internal-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.388680 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-combined-ca-bundle\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.388726 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-config\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.388741 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-ovndb-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.388788 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjmpt\" (UniqueName: \"kubernetes.io/projected/3a761443-78ec-4b4c-8e91-fb2ff1061771-kube-api-access-rjmpt\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.491431 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjmpt\" (UniqueName: \"kubernetes.io/projected/3a761443-78ec-4b4c-8e91-fb2ff1061771-kube-api-access-rjmpt\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.491596 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-httpd-config\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.491688 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-public-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.491724 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-internal-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.491791 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-combined-ca-bundle\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.491886 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-config\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.491908 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-ovndb-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.503608 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-httpd-config\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.503859 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-public-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.504290 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-internal-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.505550 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-ovndb-tls-certs\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.508941 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-combined-ca-bundle\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.523129 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjmpt\" (UniqueName: \"kubernetes.io/projected/3a761443-78ec-4b4c-8e91-fb2ff1061771-kube-api-access-rjmpt\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.524028 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-config\") pod \"neutron-755f5b5889-45mmc\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.618571 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.740804 4626 generic.go:334] "Generic (PLEG): container finished" podID="c5cc94ca-558e-4a2c-8d28-5aedbecb3090" containerID="a57d9b1f428f07b68f9321332ba192416d73ec5cf8cd22ee06bda8881d2a494c" exitCode=0
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.741052 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-47npm" event={"ID":"c5cc94ca-558e-4a2c-8d28-5aedbecb3090","Type":"ContainerDied","Data":"a57d9b1f428f07b68f9321332ba192416d73ec5cf8cd22ee06bda8881d2a494c"}
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.760407 4626 generic.go:334] "Generic (PLEG): container finished" podID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerID="01305754c22a518c1d956f73ab61f2f9e3770c5759554be086ff146263c604af" exitCode=0
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.760567 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-848f958bdf-dbqk8" event={"ID":"68c3a490-db52-4fed-baf1-07c3cf9b06bc","Type":"ContainerDied","Data":"01305754c22a518c1d956f73ab61f2f9e3770c5759554be086ff146263c604af"}
Feb 23 07:00:01 crc kubenswrapper[4626]: I0223 07:00:01.863245 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-864c594bfd-tq9x6"
Feb 23 07:00:02 crc kubenswrapper[4626]: I0223 07:00:02.742346 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-848f958bdf-dbqk8" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused"
Feb 23 07:00:03 crc kubenswrapper[4626]: I0223 07:00:03.635759 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9c5c7b856-snkxr" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Feb 23 07:00:03 crc kubenswrapper[4626]: I0223 07:00:03.783026 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-688bccf86-4crkw" podUID="d3e1e535-58de-4987-9d93-65fb6d4c9409" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Feb 23 07:00:03 crc kubenswrapper[4626]: I0223 07:00:03.792190 4626 generic.go:334] "Generic (PLEG): container finished" podID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerID="5910599f46b3fdc6c40589f88c6a1b675f88837cad419ce983039d97c9e5595b" exitCode=0
Feb 23 07:00:03 crc kubenswrapper[4626]: I0223 07:00:03.792248 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-848f958bdf-dbqk8" event={"ID":"68c3a490-db52-4fed-baf1-07c3cf9b06bc","Type":"ContainerDied","Data":"5910599f46b3fdc6c40589f88c6a1b675f88837cad419ce983039d97c9e5595b"}
Feb 23 07:00:03 crc kubenswrapper[4626]: I0223 07:00:03.798675 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6958fdb966-vkk9n"
Feb 23 07:00:03 crc kubenswrapper[4626]: I0223 07:00:03.958921 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-864c594bfd-tq9x6"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.096296 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6958fdb966-vkk9n"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.185410 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-864c594bfd-tq9x6"]
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.429754 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.433942 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.433999 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.527148 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"]
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.531511 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerName="dnsmasq-dns" containerID="cri-o://0edeaf807430d6d17d32e8a9f893fbccafc909cc904c36a4d892d919a528b6ff" gracePeriod=10
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.559090 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.594327 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.812333 4626 generic.go:334] "Generic (PLEG): container finished" podID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerID="0edeaf807430d6d17d32e8a9f893fbccafc909cc904c36a4d892d919a528b6ff" exitCode=0
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.812876 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" event={"ID":"6e24c5dc-1890-47c6-84c9-dfa3c3965c81","Type":"ContainerDied","Data":"0edeaf807430d6d17d32e8a9f893fbccafc909cc904c36a4d892d919a528b6ff"}
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.813364 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api-log" containerID="cri-o://7c93a647cb80655e652f053be44700434bf9f2752870478ab8b312a179de3ccc" gracePeriod=30
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.813586 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api" containerID="cri-o://e8561be61481e0551fb638547d7448bf977eadb55f2430eebc4850060359fd68" gracePeriod=30
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.813916 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.813945 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.829475 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.829519 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.829475 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF"
Feb 23 07:00:04 crc kubenswrapper[4626]: I0223 07:00:04.829605 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF"
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.155018 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.155393 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.209995 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.252315 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.827146 4626 generic.go:334] "Generic (PLEG): container finished" podID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerID="7c93a647cb80655e652f053be44700434bf9f2752870478ab8b312a179de3ccc" exitCode=143
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.828431 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-864c594bfd-tq9x6" event={"ID":"2fa75531-7ff2-46e6-b665-9819933be8aa","Type":"ContainerDied","Data":"7c93a647cb80655e652f053be44700434bf9f2752870478ab8b312a179de3ccc"}
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.828475 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:05 crc kubenswrapper[4626]: I0223 07:00:05.829170 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:06 crc kubenswrapper[4626]: I0223 07:00:06.838557 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:00:06 crc kubenswrapper[4626]: I0223 07:00:06.838594 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:00:07 crc kubenswrapper[4626]: I0223 07:00:07.853464 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:00:07 crc kubenswrapper[4626]: I0223 07:00:07.854073 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.322928 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-47npm"
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.433422 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-scripts\") pod \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") "
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.442842 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-combined-ca-bundle\") pod \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") "
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.442944 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-config-data\") pod \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") "
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.442996 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4nmz\" (UniqueName: \"kubernetes.io/projected/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-kube-api-access-k4nmz\") pod \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") "
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.443129 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-db-sync-config-data\") pod \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") "
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.443168 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-etc-machine-id\") pod \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\" (UID: \"c5cc94ca-558e-4a2c-8d28-5aedbecb3090\") "
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.444607 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c5cc94ca-558e-4a2c-8d28-5aedbecb3090" (UID: "c5cc94ca-558e-4a2c-8d28-5aedbecb3090"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.457196 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-scripts" (OuterVolumeSpecName: "scripts") pod "c5cc94ca-558e-4a2c-8d28-5aedbecb3090" (UID: "c5cc94ca-558e-4a2c-8d28-5aedbecb3090"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.479692 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c5cc94ca-558e-4a2c-8d28-5aedbecb3090" (UID: "c5cc94ca-558e-4a2c-8d28-5aedbecb3090"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.481127 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-kube-api-access-k4nmz" (OuterVolumeSpecName: "kube-api-access-k4nmz") pod "c5cc94ca-558e-4a2c-8d28-5aedbecb3090" (UID: "c5cc94ca-558e-4a2c-8d28-5aedbecb3090"). InnerVolumeSpecName "kube-api-access-k4nmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.516646 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-config-data" (OuterVolumeSpecName: "config-data") pod "c5cc94ca-558e-4a2c-8d28-5aedbecb3090" (UID: "c5cc94ca-558e-4a2c-8d28-5aedbecb3090"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.516693 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5cc94ca-558e-4a2c-8d28-5aedbecb3090" (UID: "c5cc94ca-558e-4a2c-8d28-5aedbecb3090"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.547085 4626 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.547130 4626 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.547142 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.547156 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.547167 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.547177 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4nmz\" (UniqueName: \"kubernetes.io/projected/c5cc94ca-558e-4a2c-8d28-5aedbecb3090-kube-api-access-k4nmz\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.774604 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.862655 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.863599 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-47npm"
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.867727 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-47npm" event={"ID":"c5cc94ca-558e-4a2c-8d28-5aedbecb3090","Type":"ContainerDied","Data":"3d0758074e2db9265400432017ff66539deb298128a7d170ba7cb48d029c0282"}
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.867791 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d0758074e2db9265400432017ff66539deb298128a7d170ba7cb48d029c0282"
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.900726 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:08 crc kubenswrapper[4626]: I0223 07:00:08.900831 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.202441 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.287083 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.461146 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.661509 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:09 crc kubenswrapper[4626]: E0223 07:00:09.662112 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc94ca-558e-4a2c-8d28-5aedbecb3090" containerName="cinder-db-sync"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.662197 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc94ca-558e-4a2c-8d28-5aedbecb3090" containerName="cinder-db-sync"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.662446 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc94ca-558e-4a2c-8d28-5aedbecb3090" containerName="cinder-db-sync"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.666469 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.670853 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.671108 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.671339 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x6xqg"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.675256 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.682944 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.726365 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855f884985-s9fqw"]
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.727700 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855f884985-s9fqw"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.745307 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855f884985-s9fqw"]
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.782692 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.782735 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-sb\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.782811 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.782853 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-swift-storage-0\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.782923 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.782958 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.782986 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd5fp\" (UniqueName: \"kubernetes.io/projected/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-kube-api-access-zd5fp\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.783065 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.783242 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f5fs\" (UniqueName: \"kubernetes.io/projected/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-kube-api-access-8f5fs\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.783365 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\"
(UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-nb\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.783400 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.783451 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-config\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885269 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f5fs\" (UniqueName: \"kubernetes.io/projected/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-kube-api-access-8f5fs\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885354 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-nb\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885373 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885406 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-config\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885454 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885480 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-sb\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885558 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885599 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-swift-storage-0\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: 
\"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885667 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885695 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885729 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd5fp\" (UniqueName: \"kubernetes.io/projected/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-kube-api-access-zd5fp\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.885798 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.886608 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: 
I0223 07:00:09.887377 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-nb\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.890578 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.890798 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.891590 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-config\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.892039 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-swift-storage-0\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.892435 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-sb\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 
crc kubenswrapper[4626]: I0223 07:00:09.899132 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.901364 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.902707 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.909580 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.915114 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.915826 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.919743 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f5fs\" (UniqueName: \"kubernetes.io/projected/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-kube-api-access-8f5fs\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.923334 4626 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zd5fp\" (UniqueName: \"kubernetes.io/projected/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-kube-api-access-zd5fp\") pod \"dnsmasq-dns-855f884985-s9fqw\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.957317 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " pod="openstack/cinder-scheduler-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.988683 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data-custom\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.988732 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.988792 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0006614-96ac-4260-a959-a66db83df548-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.989316 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-scripts\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.989398 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.989431 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0006614-96ac-4260-a959-a66db83df548-logs\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.989703 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd2nz\" (UniqueName: \"kubernetes.io/projected/f0006614-96ac-4260-a959-a66db83df548-kube-api-access-bd2nz\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:09 crc kubenswrapper[4626]: I0223 07:00:09.994364 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.063046 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.091626 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.091916 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0006614-96ac-4260-a959-a66db83df548-logs\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.091991 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd2nz\" (UniqueName: \"kubernetes.io/projected/f0006614-96ac-4260-a959-a66db83df548-kube-api-access-bd2nz\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.092038 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data-custom\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.092063 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.092080 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0006614-96ac-4260-a959-a66db83df548-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.092133 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-scripts\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.092248 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0006614-96ac-4260-a959-a66db83df548-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.092299 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0006614-96ac-4260-a959-a66db83df548-logs\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.098807 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-scripts\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.099524 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data-custom\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.106080 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.107346 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.134862 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd2nz\" (UniqueName: \"kubernetes.io/projected/f0006614-96ac-4260-a959-a66db83df548-kube-api-access-bd2nz\") pod \"cinder-api-0\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.320312 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.716451 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fttdm" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.808081 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-config-data\") pod \"c1806f1a-08dd-4b17-a799-1122348a4ab3\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.808508 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqdg\" (UniqueName: \"kubernetes.io/projected/c1806f1a-08dd-4b17-a799-1122348a4ab3-kube-api-access-dnqdg\") pod \"c1806f1a-08dd-4b17-a799-1122348a4ab3\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.808713 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-combined-ca-bundle\") pod \"c1806f1a-08dd-4b17-a799-1122348a4ab3\" (UID: \"c1806f1a-08dd-4b17-a799-1122348a4ab3\") " Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.831669 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1806f1a-08dd-4b17-a799-1122348a4ab3-kube-api-access-dnqdg" (OuterVolumeSpecName: "kube-api-access-dnqdg") pod "c1806f1a-08dd-4b17-a799-1122348a4ab3" (UID: "c1806f1a-08dd-4b17-a799-1122348a4ab3"). InnerVolumeSpecName "kube-api-access-dnqdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.858641 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1806f1a-08dd-4b17-a799-1122348a4ab3" (UID: "c1806f1a-08dd-4b17-a799-1122348a4ab3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.914944 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.914973 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqdg\" (UniqueName: \"kubernetes.io/projected/c1806f1a-08dd-4b17-a799-1122348a4ab3-kube-api-access-dnqdg\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.923123 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fttdm" event={"ID":"c1806f1a-08dd-4b17-a799-1122348a4ab3","Type":"ContainerDied","Data":"6b802228811fe0e6d94adfd5a621794d41cd33fea5320d97866d8337dd8e20d3"} Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.923185 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b802228811fe0e6d94adfd5a621794d41cd33fea5320d97866d8337dd8e20d3" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.923324 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fttdm" Feb 23 07:00:10 crc kubenswrapper[4626]: I0223 07:00:10.991747 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-config-data" (OuterVolumeSpecName: "config-data") pod "c1806f1a-08dd-4b17-a799-1122348a4ab3" (UID: "c1806f1a-08dd-4b17-a799-1122348a4ab3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:11 crc kubenswrapper[4626]: I0223 07:00:11.023407 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1806f1a-08dd-4b17-a799-1122348a4ab3-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:11 crc kubenswrapper[4626]: E0223 07:00:11.909723 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:e2e144ca85e32837f004ac72ecfd562c17d679df28915274bd03d0138b72d55c" Feb 23 07:00:11 crc kubenswrapper[4626]: E0223 07:00:11.910150 4626 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e2e144ca85e32837f004ac72ecfd562c17d679df28915274bd03d0138b72d55c,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly
:nil,},VolumeMount{Name:kube-api-access-hvp9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 07:00:11 crc kubenswrapper[4626]: E0223 07:00:11.912176 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" Feb 23 07:00:11 crc kubenswrapper[4626]: I0223 07:00:11.966993 4626 generic.go:334] "Generic (PLEG): container finished" podID="7b2634f7-a041-40b7-beb0-36e366627314" containerID="8a52ca9edb3dc9d8c1c17706c7dbbf5e4738155850985c8a92d027b4fb6091e8" exitCode=137 Feb 23 07:00:11 crc kubenswrapper[4626]: I0223 07:00:11.967315 4626 generic.go:334] "Generic (PLEG): container finished" podID="7b2634f7-a041-40b7-beb0-36e366627314" containerID="fdbb1fd152f3bb632f553e9ac09fd4e0fea49c8112c24a80099edf080a6eeadb" exitCode=137 Feb 23 07:00:11 crc kubenswrapper[4626]: I0223 07:00:11.967540 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="ceilometer-notification-agent" containerID="cri-o://085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e" gracePeriod=30 Feb 23 07:00:11 crc kubenswrapper[4626]: I0223 07:00:11.967086 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59898cd8f5-xpkhf" event={"ID":"7b2634f7-a041-40b7-beb0-36e366627314","Type":"ContainerDied","Data":"8a52ca9edb3dc9d8c1c17706c7dbbf5e4738155850985c8a92d027b4fb6091e8"} Feb 23 07:00:11 crc kubenswrapper[4626]: I0223 07:00:11.967689 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59898cd8f5-xpkhf" event={"ID":"7b2634f7-a041-40b7-beb0-36e366627314","Type":"ContainerDied","Data":"fdbb1fd152f3bb632f553e9ac09fd4e0fea49c8112c24a80099edf080a6eeadb"} Feb 23 07:00:11 crc kubenswrapper[4626]: I0223 07:00:11.968135 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="sg-core" containerID="cri-o://47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5" gracePeriod=30 Feb 23 07:00:12 crc 
kubenswrapper[4626]: I0223 07:00:12.049606 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:00:12 crc kubenswrapper[4626]: E0223 07:00:12.259769 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cfd4279_edbd_4cc5_a2d6_130c0c4ccd35.slice/crio-47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.278780 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.365675 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.365749 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-config\") pod \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.365781 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-sb\") pod \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.365815 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-swift-storage-0\") pod \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " Feb 23 07:00:12 crc 
kubenswrapper[4626]: I0223 07:00:12.365869 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-svc\") pod \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.365917 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwxkz\" (UniqueName: \"kubernetes.io/projected/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-kube-api-access-cwxkz\") pod \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.365970 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-nb\") pod \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\" (UID: \"6e24c5dc-1890-47c6-84c9-dfa3c3965c81\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.403376 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:43932->10.217.0.164:9311: read: connection reset by peer" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.403859 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-864c594bfd-tq9x6" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:43938->10.217.0.164:9311: read: connection reset by peer" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.452080 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-kube-api-access-cwxkz" (OuterVolumeSpecName: "kube-api-access-cwxkz") pod "6e24c5dc-1890-47c6-84c9-dfa3c3965c81" (UID: "6e24c5dc-1890-47c6-84c9-dfa3c3965c81"). InnerVolumeSpecName "kube-api-access-cwxkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.470263 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-httpd-config\") pod \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.470568 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-config\") pod \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.470634 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-internal-tls-certs\") pod \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.470814 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-ovndb-tls-certs\") pod \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.470835 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-public-tls-certs\") pod 
\"68c3a490-db52-4fed-baf1-07c3cf9b06bc\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.470863 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-combined-ca-bundle\") pod \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.470961 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl4bs\" (UniqueName: \"kubernetes.io/projected/68c3a490-db52-4fed-baf1-07c3cf9b06bc-kube-api-access-dl4bs\") pod \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\" (UID: \"68c3a490-db52-4fed-baf1-07c3cf9b06bc\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.471671 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwxkz\" (UniqueName: \"kubernetes.io/projected/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-kube-api-access-cwxkz\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.486417 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c3a490-db52-4fed-baf1-07c3cf9b06bc-kube-api-access-dl4bs" (OuterVolumeSpecName: "kube-api-access-dl4bs") pod "68c3a490-db52-4fed-baf1-07c3cf9b06bc" (UID: "68c3a490-db52-4fed-baf1-07c3cf9b06bc"). InnerVolumeSpecName "kube-api-access-dl4bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.487201 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6e24c5dc-1890-47c6-84c9-dfa3c3965c81" (UID: "6e24c5dc-1890-47c6-84c9-dfa3c3965c81"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.497642 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e24c5dc-1890-47c6-84c9-dfa3c3965c81" (UID: "6e24c5dc-1890-47c6-84c9-dfa3c3965c81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.545567 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "68c3a490-db52-4fed-baf1-07c3cf9b06bc" (UID: "68c3a490-db52-4fed-baf1-07c3cf9b06bc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.573530 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.573561 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl4bs\" (UniqueName: \"kubernetes.io/projected/68c3a490-db52-4fed-baf1-07c3cf9b06bc-kube-api-access-dl4bs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.573631 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.573642 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 
crc kubenswrapper[4626]: I0223 07:00:12.593067 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e24c5dc-1890-47c6-84c9-dfa3c3965c81" (UID: "6e24c5dc-1890-47c6-84c9-dfa3c3965c81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.615249 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e24c5dc-1890-47c6-84c9-dfa3c3965c81" (UID: "6e24c5dc-1890-47c6-84c9-dfa3c3965c81"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.641189 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-config" (OuterVolumeSpecName: "config") pod "68c3a490-db52-4fed-baf1-07c3cf9b06bc" (UID: "68c3a490-db52-4fed-baf1-07c3cf9b06bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.656108 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-config" (OuterVolumeSpecName: "config") pod "6e24c5dc-1890-47c6-84c9-dfa3c3965c81" (UID: "6e24c5dc-1890-47c6-84c9-dfa3c3965c81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.658874 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.676000 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.676024 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.676033 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e24c5dc-1890-47c6-84c9-dfa3c3965c81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.676041 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.688176 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68c3a490-db52-4fed-baf1-07c3cf9b06bc" (UID: "68c3a490-db52-4fed-baf1-07c3cf9b06bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.707670 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68c3a490-db52-4fed-baf1-07c3cf9b06bc" (UID: "68c3a490-db52-4fed-baf1-07c3cf9b06bc"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.720556 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "68c3a490-db52-4fed-baf1-07c3cf9b06bc" (UID: "68c3a490-db52-4fed-baf1-07c3cf9b06bc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.731923 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "68c3a490-db52-4fed-baf1-07c3cf9b06bc" (UID: "68c3a490-db52-4fed-baf1-07c3cf9b06bc"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.777299 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b2634f7-a041-40b7-beb0-36e366627314-horizon-secret-key\") pod \"7b2634f7-a041-40b7-beb0-36e366627314\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.777375 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-scripts\") pod \"7b2634f7-a041-40b7-beb0-36e366627314\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.777522 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-config-data\") pod \"7b2634f7-a041-40b7-beb0-36e366627314\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " Feb 23 07:00:12 
crc kubenswrapper[4626]: I0223 07:00:12.777661 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b2634f7-a041-40b7-beb0-36e366627314-logs\") pod \"7b2634f7-a041-40b7-beb0-36e366627314\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.777738 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rch85\" (UniqueName: \"kubernetes.io/projected/7b2634f7-a041-40b7-beb0-36e366627314-kube-api-access-rch85\") pod \"7b2634f7-a041-40b7-beb0-36e366627314\" (UID: \"7b2634f7-a041-40b7-beb0-36e366627314\") " Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.778489 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b2634f7-a041-40b7-beb0-36e366627314-logs" (OuterVolumeSpecName: "logs") pod "7b2634f7-a041-40b7-beb0-36e366627314" (UID: "7b2634f7-a041-40b7-beb0-36e366627314"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.778573 4626 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.778587 4626 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.778598 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.778607 4626 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68c3a490-db52-4fed-baf1-07c3cf9b06bc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.783166 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2634f7-a041-40b7-beb0-36e366627314-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7b2634f7-a041-40b7-beb0-36e366627314" (UID: "7b2634f7-a041-40b7-beb0-36e366627314"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.786316 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2634f7-a041-40b7-beb0-36e366627314-kube-api-access-rch85" (OuterVolumeSpecName: "kube-api-access-rch85") pod "7b2634f7-a041-40b7-beb0-36e366627314" (UID: "7b2634f7-a041-40b7-beb0-36e366627314"). InnerVolumeSpecName "kube-api-access-rch85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.800001 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-scripts" (OuterVolumeSpecName: "scripts") pod "7b2634f7-a041-40b7-beb0-36e366627314" (UID: "7b2634f7-a041-40b7-beb0-36e366627314"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.808216 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-config-data" (OuterVolumeSpecName: "config-data") pod "7b2634f7-a041-40b7-beb0-36e366627314" (UID: "7b2634f7-a041-40b7-beb0-36e366627314"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.880482 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b2634f7-a041-40b7-beb0-36e366627314-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.880918 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rch85\" (UniqueName: \"kubernetes.io/projected/7b2634f7-a041-40b7-beb0-36e366627314-kube-api-access-rch85\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.880941 4626 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b2634f7-a041-40b7-beb0-36e366627314-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.880957 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc 
kubenswrapper[4626]: I0223 07:00:12.880969 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b2634f7-a041-40b7-beb0-36e366627314-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.892123 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:00:12 crc kubenswrapper[4626]: I0223 07:00:12.920230 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.046922 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0006614-96ac-4260-a959-a66db83df548","Type":"ContainerStarted","Data":"7b3247c3ea281d8e7ce2b222eb29aed16fd28083763e6d4fa4d383c7565a291a"} Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.048437 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf" event={"ID":"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0","Type":"ContainerStarted","Data":"ebbf7827dc7c44d2424516f35a234135ca31707166b7574995999911da2289bc"} Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.060415 4626 generic.go:334] "Generic (PLEG): container finished" podID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerID="e8561be61481e0551fb638547d7448bf977eadb55f2430eebc4850060359fd68" exitCode=0 Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.060491 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-864c594bfd-tq9x6" event={"ID":"2fa75531-7ff2-46e6-b665-9819933be8aa","Type":"ContainerDied","Data":"e8561be61481e0551fb638547d7448bf977eadb55f2430eebc4850060359fd68"} Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.063456 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-755f5b5889-45mmc"] Feb 23 07:00:13 crc kubenswrapper[4626]: 
I0223 07:00:13.068208 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-848f958bdf-dbqk8" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.082095 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-848f958bdf-dbqk8" event={"ID":"68c3a490-db52-4fed-baf1-07c3cf9b06bc","Type":"ContainerDied","Data":"3090702eb32cae40cade39268a090cfd12ecbacef56880571515fee04f33a1bc"} Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.082152 4626 scope.go:117] "RemoveContainer" containerID="01305754c22a518c1d956f73ab61f2f9e3770c5759554be086ff146263c604af" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.107432 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59898cd8f5-xpkhf" event={"ID":"7b2634f7-a041-40b7-beb0-36e366627314","Type":"ContainerDied","Data":"d64251e4d7548754bd5c464a70b04207b29a39b0586369517ddd02d5032f50ac"} Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.107451 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59898cd8f5-xpkhf" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.133199 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" event={"ID":"6e24c5dc-1890-47c6-84c9-dfa3c3965c81","Type":"ContainerDied","Data":"18499cfc10f61394d878ca8729bbdd2bcffef2bf6e9cc58520125d11a71965d7"} Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.133309 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b7cb4c7b9-nj7qj" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.144987 4626 generic.go:334] "Generic (PLEG): container finished" podID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerID="47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5" exitCode=2 Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.145115 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35","Type":"ContainerDied","Data":"47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5"} Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.151293 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855f884985-s9fqw"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.199870 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:00:13 crc kubenswrapper[4626]: W0223 07:00:13.214473 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a5d4b2b_46ce_4bef_a8f6_4b877b97b1a4.slice/crio-1e62560da99b919dc301d44628df2b82d5ce22edb4edc48133f1a8949cd82d4b WatchSource:0}: Error finding container 1e62560da99b919dc301d44628df2b82d5ce22edb4edc48133f1a8949cd82d4b: Status 404 returned error can't find the container with id 1e62560da99b919dc301d44628df2b82d5ce22edb4edc48133f1a8949cd82d4b Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.278240 4626 scope.go:117] "RemoveContainer" containerID="5910599f46b3fdc6c40589f88c6a1b675f88837cad419ce983039d97c9e5595b" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.341697 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.429405 4626 scope.go:117] "RemoveContainer" containerID="8a52ca9edb3dc9d8c1c17706c7dbbf5e4738155850985c8a92d027b4fb6091e8" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.432915 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-848f958bdf-dbqk8"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.447806 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-848f958bdf-dbqk8"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.460389 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59898cd8f5-xpkhf"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.471640 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59898cd8f5-xpkhf"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.478993 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.483657 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b7cb4c7b9-nj7qj"] Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.504473 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa75531-7ff2-46e6-b665-9819933be8aa-logs\") pod \"2fa75531-7ff2-46e6-b665-9819933be8aa\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.505974 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fa75531-7ff2-46e6-b665-9819933be8aa-logs" (OuterVolumeSpecName: "logs") pod "2fa75531-7ff2-46e6-b665-9819933be8aa" (UID: "2fa75531-7ff2-46e6-b665-9819933be8aa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.506594 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data\") pod \"2fa75531-7ff2-46e6-b665-9819933be8aa\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.506874 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-combined-ca-bundle\") pod \"2fa75531-7ff2-46e6-b665-9819933be8aa\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.507311 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k622\" (UniqueName: \"kubernetes.io/projected/2fa75531-7ff2-46e6-b665-9819933be8aa-kube-api-access-7k622\") pod \"2fa75531-7ff2-46e6-b665-9819933be8aa\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.507482 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data-custom\") pod \"2fa75531-7ff2-46e6-b665-9819933be8aa\" (UID: \"2fa75531-7ff2-46e6-b665-9819933be8aa\") " Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.509959 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa75531-7ff2-46e6-b665-9819933be8aa-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.512570 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa75531-7ff2-46e6-b665-9819933be8aa-kube-api-access-7k622" (OuterVolumeSpecName: 
"kube-api-access-7k622") pod "2fa75531-7ff2-46e6-b665-9819933be8aa" (UID: "2fa75531-7ff2-46e6-b665-9819933be8aa"). InnerVolumeSpecName "kube-api-access-7k622". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.513687 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2fa75531-7ff2-46e6-b665-9819933be8aa" (UID: "2fa75531-7ff2-46e6-b665-9819933be8aa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.540720 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fa75531-7ff2-46e6-b665-9819933be8aa" (UID: "2fa75531-7ff2-46e6-b665-9819933be8aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.547103 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data" (OuterVolumeSpecName: "config-data") pod "2fa75531-7ff2-46e6-b665-9819933be8aa" (UID: "2fa75531-7ff2-46e6-b665-9819933be8aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.613960 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.614005 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.614016 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa75531-7ff2-46e6-b665-9819933be8aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.614032 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k622\" (UniqueName: \"kubernetes.io/projected/2fa75531-7ff2-46e6-b665-9819933be8aa-kube-api-access-7k622\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.720303 4626 scope.go:117] "RemoveContainer" containerID="fdbb1fd152f3bb632f553e9ac09fd4e0fea49c8112c24a80099edf080a6eeadb" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.787991 4626 scope.go:117] "RemoveContainer" containerID="0edeaf807430d6d17d32e8a9f893fbccafc909cc904c36a4d892d919a528b6ff" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.868408 4626 scope.go:117] "RemoveContainer" containerID="8579c9d227d2efe45662504b8ce7546b5d19b652804abcc8283456760af039b2" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.998958 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" path="/var/lib/kubelet/pods/68c3a490-db52-4fed-baf1-07c3cf9b06bc/volumes" Feb 23 07:00:13 crc kubenswrapper[4626]: I0223 07:00:13.999583 4626 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" path="/var/lib/kubelet/pods/6e24c5dc-1890-47c6-84c9-dfa3c3965c81/volumes" Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.000209 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b2634f7-a041-40b7-beb0-36e366627314" path="/var/lib/kubelet/pods/7b2634f7-a041-40b7-beb0-36e366627314/volumes" Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.183028 4626 generic.go:334] "Generic (PLEG): container finished" podID="8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" containerID="de8772856d5052fc2b5063c72cb46e6bbfc902f8ffc0916450590bcafe4071cd" exitCode=0 Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.183185 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf" event={"ID":"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0","Type":"ContainerDied","Data":"de8772856d5052fc2b5063c72cb46e6bbfc902f8ffc0916450590bcafe4071cd"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.189615 4626 generic.go:334] "Generic (PLEG): container finished" podID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerID="9c4b8ccbf45740b21ab8ccd4f79dd6a7a91455b87dab4dddeeea90395c637192" exitCode=0 Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.189715 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855f884985-s9fqw" event={"ID":"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7","Type":"ContainerDied","Data":"9c4b8ccbf45740b21ab8ccd4f79dd6a7a91455b87dab4dddeeea90395c637192"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.189751 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855f884985-s9fqw" event={"ID":"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7","Type":"ContainerStarted","Data":"a5385bc3d17051e665df6c43a7859dbd071711e927594df673e814543c92b35e"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.221030 4626 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-755f5b5889-45mmc" event={"ID":"3a761443-78ec-4b4c-8e91-fb2ff1061771","Type":"ContainerStarted","Data":"3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.221070 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755f5b5889-45mmc" event={"ID":"3a761443-78ec-4b4c-8e91-fb2ff1061771","Type":"ContainerStarted","Data":"bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.221083 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755f5b5889-45mmc" event={"ID":"3a761443-78ec-4b4c-8e91-fb2ff1061771","Type":"ContainerStarted","Data":"336a7ee4f8ad0ead41b49175f499b39c3525a3c4be66604e7f1489059d118650"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.221757 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-755f5b5889-45mmc" Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.230833 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0006614-96ac-4260-a959-a66db83df548","Type":"ContainerStarted","Data":"87705677b2dfb886b96b1b16b201b53deaeba92b52afba2c8420755e4fca47b0"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.252880 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-755f5b5889-45mmc" podStartSLOduration=13.252866422 podStartE2EDuration="13.252866422s" podCreationTimestamp="2026-02-23 07:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:14.249470331 +0000 UTC m=+1166.588799597" watchObservedRunningTime="2026-02-23 07:00:14.252866422 +0000 UTC m=+1166.592195688" Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.262058 4626 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-api-864c594bfd-tq9x6" event={"ID":"2fa75531-7ff2-46e6-b665-9819933be8aa","Type":"ContainerDied","Data":"f5a87233f6b9fd401bb53bf9348bad49d855d50bb4e74b4e3c6e1f199a9e309d"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.262106 4626 scope.go:117] "RemoveContainer" containerID="e8561be61481e0551fb638547d7448bf977eadb55f2430eebc4850060359fd68" Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.262200 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-864c594bfd-tq9x6" Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.264712 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4","Type":"ContainerStarted","Data":"1e62560da99b919dc301d44628df2b82d5ce22edb4edc48133f1a8949cd82d4b"} Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.289197 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-864c594bfd-tq9x6"] Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.294737 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-864c594bfd-tq9x6"] Feb 23 07:00:14 crc kubenswrapper[4626]: I0223 07:00:14.349051 4626 scope.go:117] "RemoveContainer" containerID="7c93a647cb80655e652f053be44700434bf9f2752870478ab8b312a179de3ccc" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.315948 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0006614-96ac-4260-a959-a66db83df548","Type":"ContainerStarted","Data":"0f7e9544924a8de544e50ff8752bbcc36e4412b8d98d51da038f2153fa68a305"} Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.316728 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api-log" 
containerID="cri-o://87705677b2dfb886b96b1b16b201b53deaeba92b52afba2c8420755e4fca47b0" gracePeriod=30 Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.316846 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.316873 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api" containerID="cri-o://0f7e9544924a8de544e50ff8752bbcc36e4412b8d98d51da038f2153fa68a305" gracePeriod=30 Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.328576 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4","Type":"ContainerStarted","Data":"57c40db0aac8894918eac5a5576697f550611b6b700906b05689ccf45e6659fe"} Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.332395 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855f884985-s9fqw" event={"ID":"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7","Type":"ContainerStarted","Data":"2088a08dd29b6e6be47be4d391ea0ef87aa51a3b896e88596e2767a0074ae42b"} Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.333051 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.351853 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.351831973 podStartE2EDuration="6.351831973s" podCreationTimestamp="2026-02-23 07:00:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:15.34963919 +0000 UTC m=+1167.688968456" watchObservedRunningTime="2026-02-23 07:00:15.351831973 +0000 UTC m=+1167.691161239" Feb 23 07:00:15 crc kubenswrapper[4626]: 
I0223 07:00:15.395424 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-855f884985-s9fqw" podStartSLOduration=6.395400279 podStartE2EDuration="6.395400279s" podCreationTimestamp="2026-02-23 07:00:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:15.37384088 +0000 UTC m=+1167.713170146" watchObservedRunningTime="2026-02-23 07:00:15.395400279 +0000 UTC m=+1167.734729545" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.752693 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.832121 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.879619 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-config-volume\") pod \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.879828 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqnd7\" (UniqueName: \"kubernetes.io/projected/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-kube-api-access-sqnd7\") pod \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.880147 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-secret-volume\") pod \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\" (UID: \"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0\") " Feb 23 
07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.880691 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" (UID: "8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.881553 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.886231 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" (UID: "8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.886259 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-kube-api-access-sqnd7" (OuterVolumeSpecName: "kube-api-access-sqnd7") pod "8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" (UID: "8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0"). InnerVolumeSpecName "kube-api-access-sqnd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.982910 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-config-data\") pod \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.983162 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-scripts\") pod \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.983255 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-sg-core-conf-yaml\") pod \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.983409 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvp9n\" (UniqueName: \"kubernetes.io/projected/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-kube-api-access-hvp9n\") pod \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.983563 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-combined-ca-bundle\") pod \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.983644 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-log-httpd\") pod \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.983684 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-run-httpd\") pod \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\" (UID: \"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35\") " Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.984665 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.984691 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqnd7\" (UniqueName: \"kubernetes.io/projected/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0-kube-api-access-sqnd7\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.985000 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" (UID: "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.985732 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" (UID: "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.993999 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-kube-api-access-hvp9n" (OuterVolumeSpecName: "kube-api-access-hvp9n") pod "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" (UID: "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35"). InnerVolumeSpecName "kube-api-access-hvp9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:15 crc kubenswrapper[4626]: I0223 07:00:15.997192 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-scripts" (OuterVolumeSpecName: "scripts") pod "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" (UID: "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.006134 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" path="/var/lib/kubelet/pods/2fa75531-7ff2-46e6-b665-9819933be8aa/volumes" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.026175 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-config-data" (OuterVolumeSpecName: "config-data") pod "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" (UID: "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.026201 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" (UID: "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.038600 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" (UID: "1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.088013 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.088040 4626 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.088054 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvp9n\" (UniqueName: \"kubernetes.io/projected/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-kube-api-access-hvp9n\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.088064 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.088073 4626 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.088082 4626 reconciler_common.go:293] "Volume detached for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.088090 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.348828 4626 generic.go:334] "Generic (PLEG): container finished" podID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerID="085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e" exitCode=0 Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.348960 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35","Type":"ContainerDied","Data":"085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e"} Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.348994 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35","Type":"ContainerDied","Data":"8e0c2b77838f707f9dd1000102c2d5bd2b63b4d219f258702d48c06e48f86f50"} Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.349016 4626 scope.go:117] "RemoveContainer" containerID="47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.349159 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.360476 4626 generic.go:334] "Generic (PLEG): container finished" podID="f0006614-96ac-4260-a959-a66db83df548" containerID="87705677b2dfb886b96b1b16b201b53deaeba92b52afba2c8420755e4fca47b0" exitCode=143 Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.360541 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0006614-96ac-4260-a959-a66db83df548","Type":"ContainerDied","Data":"87705677b2dfb886b96b1b16b201b53deaeba92b52afba2c8420755e4fca47b0"} Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.364405 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4","Type":"ContainerStarted","Data":"e986a7e2ce13e9083c49c1ebb06380febd8bf801832f5a1fbba2842bc0dde938"} Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.379588 4626 scope.go:117] "RemoveContainer" containerID="085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.384181 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf" event={"ID":"8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0","Type":"ContainerDied","Data":"ebbf7827dc7c44d2424516f35a234135ca31707166b7574995999911da2289bc"} Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.384234 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebbf7827dc7c44d2424516f35a234135ca31707166b7574995999911da2289bc" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.384192 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.414954 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.236855784 podStartE2EDuration="7.414935896s" podCreationTimestamp="2026-02-23 07:00:09 +0000 UTC" firstStartedPulling="2026-02-23 07:00:13.224609175 +0000 UTC m=+1165.563938441" lastFinishedPulling="2026-02-23 07:00:14.402689297 +0000 UTC m=+1166.742018553" observedRunningTime="2026-02-23 07:00:16.394076245 +0000 UTC m=+1168.733405510" watchObservedRunningTime="2026-02-23 07:00:16.414935896 +0000 UTC m=+1168.754265162" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.425635 4626 scope.go:117] "RemoveContainer" containerID="47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5" Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.426147 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5\": container with ID starting with 47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5 not found: ID does not exist" containerID="47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.426179 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5"} err="failed to get container status \"47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5\": rpc error: code = NotFound desc = could not find container \"47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5\": container with ID starting with 47508dfbb03e30e2112ce7aa5e7a2e3b942d5b84f3c933472f2286d1d2ea83a5 not found: ID does not exist" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 
07:00:16.426200 4626 scope.go:117] "RemoveContainer" containerID="085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e" Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.426430 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e\": container with ID starting with 085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e not found: ID does not exist" containerID="085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.426452 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e"} err="failed to get container status \"085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e\": rpc error: code = NotFound desc = could not find container \"085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e\": container with ID starting with 085b1168813a628d3720fe7175860e6066413ddcae2b3992af3d719972aef78e not found: ID does not exist" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.442943 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.448079 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467169 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467595 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerName="init" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467611 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" 
containerName="init" Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467628 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1806f1a-08dd-4b17-a799-1122348a4ab3" containerName="heat-db-sync" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467636 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1806f1a-08dd-4b17-a799-1122348a4ab3" containerName="heat-db-sync" Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467655 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api-log" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467661 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api-log" Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467670 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467675 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon" Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467682 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon-log" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467687 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon-log" Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467693 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="sg-core" Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467698 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="sg-core" Feb 23 07:00:16 crc 
kubenswrapper[4626]: E0223 07:00:16.467708 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="ceilometer-notification-agent"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467713 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="ceilometer-notification-agent"
Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467722 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" containerName="collect-profiles"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467727 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" containerName="collect-profiles"
Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467737 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-api"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467743 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-api"
Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467753 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467758 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api"
Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467766 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerName="dnsmasq-dns"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467771 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerName="dnsmasq-dns"
Feb 23 07:00:16 crc kubenswrapper[4626]: E0223 07:00:16.467779 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-httpd"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.467785 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-httpd"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473721 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api-log"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473743 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473757 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e24c5dc-1890-47c6-84c9-dfa3c3965c81" containerName="dnsmasq-dns"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473767 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa75531-7ff2-46e6-b665-9819933be8aa" containerName="barbican-api"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473773 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="ceilometer-notification-agent"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473785 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-httpd"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473800 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" containerName="collect-profiles"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473808 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" containerName="sg-core"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473815 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1806f1a-08dd-4b17-a799-1122348a4ab3" containerName="heat-db-sync"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473829 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2634f7-a041-40b7-beb0-36e366627314" containerName="horizon-log"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.473848 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c3a490-db52-4fed-baf1-07c3cf9b06bc" containerName="neutron-api"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.475253 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.480819 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.481057 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.512836 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.608875 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-config-data\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.609040 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-log-httpd\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.609135 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-scripts\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.609297 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.609395 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-run-httpd\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.609569 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prg8\" (UniqueName: \"kubernetes.io/projected/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-kube-api-access-7prg8\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.609684 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.687545 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-688bccf86-4crkw"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.713340 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-scripts\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.713549 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.713642 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-run-httpd\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.713720 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prg8\" (UniqueName: \"kubernetes.io/projected/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-kube-api-access-7prg8\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.713810 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.713944 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-config-data\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.714050 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-log-httpd\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.714689 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-run-httpd\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.715461 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-log-httpd\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.718142 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.718957 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-scripts\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.727913 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.730619 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-config-data\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.731415 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prg8\" (UniqueName: \"kubernetes.io/projected/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-kube-api-access-7prg8\") pod \"ceilometer-0\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " pod="openstack/ceilometer-0"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.754172 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9c5c7b856-snkxr"
Feb 23 07:00:16 crc kubenswrapper[4626]: I0223 07:00:16.801091 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:00:17 crc kubenswrapper[4626]: I0223 07:00:17.283341 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:00:17 crc kubenswrapper[4626]: I0223 07:00:17.396610 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerStarted","Data":"6fb4920403eb87fd81c769722049cfd430f8ec450bf1e29a6b922b78d005a8c2"}
Feb 23 07:00:18 crc kubenswrapper[4626]: I0223 07:00:18.011733 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35" path="/var/lib/kubelet/pods/1cfd4279-edbd-4cc5-a2d6-130c0c4ccd35/volumes"
Feb 23 07:00:18 crc kubenswrapper[4626]: I0223 07:00:18.312110 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9c5c7b856-snkxr"
Feb 23 07:00:18 crc kubenswrapper[4626]: I0223 07:00:18.410960 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-688bccf86-4crkw"
Feb 23 07:00:18 crc kubenswrapper[4626]: I0223 07:00:18.468091 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerStarted","Data":"ef878a1d1d88bf9bd3e350eeb007136b49032f622448769c43e8400b5a7417b4"}
Feb 23 07:00:18 crc kubenswrapper[4626]: I0223 07:00:18.511838 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9c5c7b856-snkxr"]
Feb 23 07:00:18 crc kubenswrapper[4626]: I0223 07:00:18.512093 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9c5c7b856-snkxr" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon-log" containerID="cri-o://a5a014efa76b7de50de6290e6814485f21418749702dd366d652bb5a811d9fcf" gracePeriod=30
Feb 23 07:00:18 crc kubenswrapper[4626]: I0223 07:00:18.512286 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9c5c7b856-snkxr" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" containerID="cri-o://60f2992022985555c9b19a0c7fb549a7d54427452a1fd539453a54ef31a10a6b" gracePeriod=30
Feb 23 07:00:19 crc kubenswrapper[4626]: I0223 07:00:19.486534 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerStarted","Data":"4be2ac2b8a0146115cfafe4cf8e049c760b81916f143932060e48d95711b3bb8"}
Feb 23 07:00:19 crc kubenswrapper[4626]: I0223 07:00:19.995440 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.065684 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-855f884985-s9fqw"
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.134320 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96d55c7d9-glvbc"]
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.134856 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" podUID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerName="dnsmasq-dns" containerID="cri-o://f9f57a3ef8c003cf3ddc4d939666fb8d9a093b48941450d1cf8e40cceff88fa3" gracePeriod=10
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.570332 4626 generic.go:334] "Generic (PLEG): container finished" podID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerID="f9f57a3ef8c003cf3ddc4d939666fb8d9a093b48941450d1cf8e40cceff88fa3" exitCode=0
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.570714 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" event={"ID":"dc2ac61b-c277-49f1-becb-040e73b53e8a","Type":"ContainerDied","Data":"f9f57a3ef8c003cf3ddc4d939666fb8d9a093b48941450d1cf8e40cceff88fa3"}
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.594880 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerStarted","Data":"e2462c86e98b963cae55f557dabd96a717fe6c2639e81395164dba612ac53c59"}
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.731898 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc"
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.845541 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-config\") pod \"dc2ac61b-c277-49f1-becb-040e73b53e8a\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") "
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.845679 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-swift-storage-0\") pod \"dc2ac61b-c277-49f1-becb-040e73b53e8a\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") "
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.845732 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-svc\") pod \"dc2ac61b-c277-49f1-becb-040e73b53e8a\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") "
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.845810 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-nb\") pod \"dc2ac61b-c277-49f1-becb-040e73b53e8a\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") "
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.845830 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-sb\") pod \"dc2ac61b-c277-49f1-becb-040e73b53e8a\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") "
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.845961 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn5j2\" (UniqueName: \"kubernetes.io/projected/dc2ac61b-c277-49f1-becb-040e73b53e8a-kube-api-access-zn5j2\") pod \"dc2ac61b-c277-49f1-becb-040e73b53e8a\" (UID: \"dc2ac61b-c277-49f1-becb-040e73b53e8a\") "
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.880701 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2ac61b-c277-49f1-becb-040e73b53e8a-kube-api-access-zn5j2" (OuterVolumeSpecName: "kube-api-access-zn5j2") pod "dc2ac61b-c277-49f1-becb-040e73b53e8a" (UID: "dc2ac61b-c277-49f1-becb-040e73b53e8a"). InnerVolumeSpecName "kube-api-access-zn5j2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.957991 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn5j2\" (UniqueName: \"kubernetes.io/projected/dc2ac61b-c277-49f1-becb-040e73b53e8a-kube-api-access-zn5j2\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.963217 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc2ac61b-c277-49f1-becb-040e73b53e8a" (UID: "dc2ac61b-c277-49f1-becb-040e73b53e8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.978665 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dc2ac61b-c277-49f1-becb-040e73b53e8a" (UID: "dc2ac61b-c277-49f1-becb-040e73b53e8a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.981872 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-config" (OuterVolumeSpecName: "config") pod "dc2ac61b-c277-49f1-becb-040e73b53e8a" (UID: "dc2ac61b-c277-49f1-becb-040e73b53e8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:00:20 crc kubenswrapper[4626]: I0223 07:00:20.993284 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc2ac61b-c277-49f1-becb-040e73b53e8a" (UID: "dc2ac61b-c277-49f1-becb-040e73b53e8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.005936 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc2ac61b-c277-49f1-becb-040e73b53e8a" (UID: "dc2ac61b-c277-49f1-becb-040e73b53e8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.060428 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.060811 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.060827 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.060837 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.060848 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc2ac61b-c277-49f1-becb-040e73b53e8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.628894 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc" event={"ID":"dc2ac61b-c277-49f1-becb-040e73b53e8a","Type":"ContainerDied","Data":"dd895b209a40d93d61118cfc2fdd62bb77ad1bcd68e7a36a7af0aabce84641c1"}
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.628980 4626 scope.go:117] "RemoveContainer" containerID="f9f57a3ef8c003cf3ddc4d939666fb8d9a093b48941450d1cf8e40cceff88fa3"
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.629129 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96d55c7d9-glvbc"
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.644003 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerStarted","Data":"38568a9892113266e45e1d53968595da05aefc6cde23102e7c8d0ade9d7d77f2"}
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.644816 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.675742 4626 scope.go:117] "RemoveContainer" containerID="494206962a10a823f9210af48fa50d5c7a69c60c982c62ba90c2962d8e61ecf5"
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.681795 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96d55c7d9-glvbc"]
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.689397 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-96d55c7d9-glvbc"]
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.700964 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9634429180000001 podStartE2EDuration="5.700953002s" podCreationTimestamp="2026-02-23 07:00:16 +0000 UTC" firstStartedPulling="2026-02-23 07:00:17.287699397 +0000 UTC m=+1169.627028663" lastFinishedPulling="2026-02-23 07:00:21.025209481 +0000 UTC m=+1173.364538747" observedRunningTime="2026-02-23 07:00:21.697297201 +0000 UTC m=+1174.036626468" watchObservedRunningTime="2026-02-23 07:00:21.700953002 +0000 UTC m=+1174.040282259"
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.942261 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cb97bcbf6-sl6hx"
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.950470 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6ffd4ff45f-xttfr"
Feb 23 07:00:21 crc kubenswrapper[4626]: I0223 07:00:21.991958 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2ac61b-c277-49f1-becb-040e73b53e8a" path="/var/lib/kubelet/pods/dc2ac61b-c277-49f1-becb-040e73b53e8a/volumes"
Feb 23 07:00:22 crc kubenswrapper[4626]: I0223 07:00:22.020570 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cb97bcbf6-sl6hx"
Feb 23 07:00:22 crc kubenswrapper[4626]: I0223 07:00:22.653374 4626 generic.go:334] "Generic (PLEG): container finished" podID="8624d986-dff6-40bd-937d-755c2ca809d9" containerID="60f2992022985555c9b19a0c7fb549a7d54427452a1fd539453a54ef31a10a6b" exitCode=0
Feb 23 07:00:22 crc kubenswrapper[4626]: I0223 07:00:22.653447 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c5c7b856-snkxr" event={"ID":"8624d986-dff6-40bd-937d-755c2ca809d9","Type":"ContainerDied","Data":"60f2992022985555c9b19a0c7fb549a7d54427452a1fd539453a54ef31a10a6b"}
Feb 23 07:00:23 crc kubenswrapper[4626]: I0223 07:00:23.634747 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9c5c7b856-snkxr" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.342016 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 23 07:00:24 crc kubenswrapper[4626]: E0223 07:00:24.342717 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerName="dnsmasq-dns"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.342731 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerName="dnsmasq-dns"
Feb 23 07:00:24 crc kubenswrapper[4626]: E0223 07:00:24.342745 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerName="init"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.342751 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerName="init"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.342945 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc2ac61b-c277-49f1-becb-040e73b53e8a" containerName="dnsmasq-dns"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.343568 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.348843 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.348964 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-q5wjv"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.350363 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.374235 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.379814 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.464818 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36431530-ec45-4670-bc2c-ababbf867d6f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.464979 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36431530-ec45-4670-bc2c-ababbf867d6f-openstack-config\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.465077 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5cr\" (UniqueName: \"kubernetes.io/projected/36431530-ec45-4670-bc2c-ababbf867d6f-kube-api-access-qv5cr\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.465415 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36431530-ec45-4670-bc2c-ababbf867d6f-openstack-config-secret\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.567776 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36431530-ec45-4670-bc2c-ababbf867d6f-openstack-config-secret\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.567892 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36431530-ec45-4670-bc2c-ababbf867d6f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.567993 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36431530-ec45-4670-bc2c-ababbf867d6f-openstack-config\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.568066 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5cr\" (UniqueName: \"kubernetes.io/projected/36431530-ec45-4670-bc2c-ababbf867d6f-kube-api-access-qv5cr\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.568893 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36431530-ec45-4670-bc2c-ababbf867d6f-openstack-config\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.575936 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36431530-ec45-4670-bc2c-ababbf867d6f-openstack-config-secret\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.576086 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36431530-ec45-4670-bc2c-ababbf867d6f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.612302 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5cr\" (UniqueName: \"kubernetes.io/projected/36431530-ec45-4670-bc2c-ababbf867d6f-kube-api-access-qv5cr\") pod \"openstackclient\" (UID: \"36431530-ec45-4670-bc2c-ababbf867d6f\") " pod="openstack/openstackclient"
Feb 23 07:00:24 crc kubenswrapper[4626]: I0223 07:00:24.666897 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.224880 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.286977 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.324315 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.684897 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="cinder-scheduler" containerID="cri-o://57c40db0aac8894918eac5a5576697f550611b6b700906b05689ccf45e6659fe" gracePeriod=30
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.685290 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"36431530-ec45-4670-bc2c-ababbf867d6f","Type":"ContainerStarted","Data":"cc5f314367213e8f509474ae462d3f1d2404fd0d6d23774f7c5bc76ed918ffd3"}
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.685529 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="probe" containerID="cri-o://e986a7e2ce13e9083c49c1ebb06380febd8bf801832f5a1fbba2842bc0dde938" gracePeriod=30
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.686214 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:00:25 crc kubenswrapper[4626]: I0223 07:00:25.686255 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:00:26 crc kubenswrapper[4626]: I0223 07:00:26.706385 4626 generic.go:334] "Generic (PLEG): container finished" podID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerID="e986a7e2ce13e9083c49c1ebb06380febd8bf801832f5a1fbba2842bc0dde938" exitCode=0
Feb 23 07:00:26 crc kubenswrapper[4626]: I0223 07:00:26.706436 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4","Type":"ContainerDied","Data":"e986a7e2ce13e9083c49c1ebb06380febd8bf801832f5a1fbba2842bc0dde938"}
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.230169 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-655f59485b-t5d4q"]
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.231414 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-655f59485b-t5d4q"
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.238819 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-x72r5"
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.239030 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.239037 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.240311 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data-custom\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q"
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.240509 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q"
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.240568 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-combined-ca-bundle\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q"
Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.240647 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s95rg\" (UniqueName:
\"kubernetes.io/projected/774d5101-24e2-4871-9a1a-f136698cf092-kube-api-access-s95rg\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.251559 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-655f59485b-t5d4q"] Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.343759 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.343838 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-combined-ca-bundle\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.343898 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s95rg\" (UniqueName: \"kubernetes.io/projected/774d5101-24e2-4871-9a1a-f136698cf092-kube-api-access-s95rg\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.343999 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data-custom\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: 
I0223 07:00:27.360209 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-combined-ca-bundle\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.360728 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data-custom\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.382526 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.395183 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s95rg\" (UniqueName: \"kubernetes.io/projected/774d5101-24e2-4871-9a1a-f136698cf092-kube-api-access-s95rg\") pod \"heat-engine-655f59485b-t5d4q\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.555066 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-545447d6fb-cf9md"] Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.555755 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.556359 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.563167 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.654006 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data-custom\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.654097 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfq4z\" (UniqueName: \"kubernetes.io/projected/aa38e624-6de3-4e41-841e-21109ef1e723-kube-api-access-gfq4z\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.654162 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-combined-ca-bundle\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.654206 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.703136 4626 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7c56fd79f-hhg42"] Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.704840 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.723550 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-545447d6fb-cf9md"] Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755529 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-combined-ca-bundle\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755589 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755623 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-swift-storage-0\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755651 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-nb\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 
07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755667 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-config\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755700 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-sb\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755767 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njz4p\" (UniqueName: \"kubernetes.io/projected/5bc64d2f-e681-426e-a19f-1c0da0764ef7-kube-api-access-njz4p\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755855 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data-custom\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755885 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-svc\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 
23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.755922 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfq4z\" (UniqueName: \"kubernetes.io/projected/aa38e624-6de3-4e41-841e-21109ef1e723-kube-api-access-gfq4z\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.763215 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.783088 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-combined-ca-bundle\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.784322 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data-custom\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.818432 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfq4z\" (UniqueName: \"kubernetes.io/projected/aa38e624-6de3-4e41-841e-21109ef1e723-kube-api-access-gfq4z\") pod \"heat-cfnapi-545447d6fb-cf9md\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 
07:00:27.841953 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5449db68d9-tvqls"] Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.847274 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.856819 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njz4p\" (UniqueName: \"kubernetes.io/projected/5bc64d2f-e681-426e-a19f-1c0da0764ef7-kube-api-access-njz4p\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.856861 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zfwx\" (UniqueName: \"kubernetes.io/projected/17c8fbb1-7197-4b91-972f-1d916c09cd19-kube-api-access-9zfwx\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.856896 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data-custom\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.856956 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.856983 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-svc\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.857069 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-swift-storage-0\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.857089 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-nb\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.857106 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-config\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.857134 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-sb\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.857164 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-combined-ca-bundle\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.858277 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-nb\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.858674 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-config\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.865294 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-svc\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.867688 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-swift-storage-0\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.867835 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-sb\") 
pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.869029 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.881773 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5449db68d9-tvqls"] Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.893153 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njz4p\" (UniqueName: \"kubernetes.io/projected/5bc64d2f-e681-426e-a19f-1c0da0764ef7-kube-api-access-njz4p\") pod \"dnsmasq-dns-7c56fd79f-hhg42\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.901138 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.903630 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c56fd79f-hhg42"] Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.959073 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-combined-ca-bundle\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.959195 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zfwx\" (UniqueName: \"kubernetes.io/projected/17c8fbb1-7197-4b91-972f-1d916c09cd19-kube-api-access-9zfwx\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 
crc kubenswrapper[4626]: I0223 07:00:27.959245 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data-custom\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.959337 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.967756 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data-custom\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.976015 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-combined-ca-bundle\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:27 crc kubenswrapper[4626]: I0223 07:00:27.981754 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.015332 4626 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-9zfwx\" (UniqueName: \"kubernetes.io/projected/17c8fbb1-7197-4b91-972f-1d916c09cd19-kube-api-access-9zfwx\") pod \"heat-api-5449db68d9-tvqls\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.035703 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.173526 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.435124 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-655f59485b-t5d4q"] Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.660265 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-545447d6fb-cf9md"] Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.765551 4626 generic.go:334] "Generic (PLEG): container finished" podID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerID="57c40db0aac8894918eac5a5576697f550611b6b700906b05689ccf45e6659fe" exitCode=0 Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.765630 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4","Type":"ContainerDied","Data":"57c40db0aac8894918eac5a5576697f550611b6b700906b05689ccf45e6659fe"} Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.767127 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-545447d6fb-cf9md" event={"ID":"aa38e624-6de3-4e41-841e-21109ef1e723","Type":"ContainerStarted","Data":"dddf5c49ffa58c2a2d0ae92c22c00127bfe6cde4ae5e28e4fa1a0068ac418641"} Feb 23 07:00:28 crc kubenswrapper[4626]: I0223 07:00:28.770157 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-655f59485b-t5d4q" 
event={"ID":"774d5101-24e2-4871-9a1a-f136698cf092","Type":"ContainerStarted","Data":"0a5a5f0e358fa87b1dfb3b7e05f0b089649bea7a19f9cf36580896124ce250a9"} Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.012328 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.015358 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5449db68d9-tvqls"] Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.066548 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c56fd79f-hhg42"] Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.129472 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-combined-ca-bundle\") pod \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.129825 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-etc-machine-id\") pod \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") " Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.129953 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" (UID: "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.130100 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f5fs\" (UniqueName: \"kubernetes.io/projected/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-kube-api-access-8f5fs\") pod \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") "
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.130265 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-scripts\") pod \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") "
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.130679 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data-custom\") pod \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") "
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.130818 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data\") pod \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\" (UID: \"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4\") "
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.131578 4626 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.138321 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-kube-api-access-8f5fs" (OuterVolumeSpecName: "kube-api-access-8f5fs") pod "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" (UID: "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4"). InnerVolumeSpecName "kube-api-access-8f5fs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.138419 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-scripts" (OuterVolumeSpecName: "scripts") pod "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" (UID: "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.139984 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" (UID: "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.218541 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" (UID: "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.235222 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.235258 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f5fs\" (UniqueName: \"kubernetes.io/projected/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-kube-api-access-8f5fs\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.235272 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.235281 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.281579 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data" (OuterVolumeSpecName: "config-data") pod "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" (UID: "4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.338603 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.789690 4626 generic.go:334] "Generic (PLEG): container finished" podID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerID="9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c" exitCode=0
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.789978 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" event={"ID":"5bc64d2f-e681-426e-a19f-1c0da0764ef7","Type":"ContainerDied","Data":"9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c"}
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.790010 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" event={"ID":"5bc64d2f-e681-426e-a19f-1c0da0764ef7","Type":"ContainerStarted","Data":"1bda026ac1b2e52d0fdbd92106de0aa7f075469269df238c67e2dbe1f8de9226"}
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.796919 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-655f59485b-t5d4q" event={"ID":"774d5101-24e2-4871-9a1a-f136698cf092","Type":"ContainerStarted","Data":"04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1"}
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.797438 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-655f59485b-t5d4q"
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.803212 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5449db68d9-tvqls" event={"ID":"17c8fbb1-7197-4b91-972f-1d916c09cd19","Type":"ContainerStarted","Data":"67af7f4c95de43f3b18932ed67c1782df7ac9d39c437d55e1a4cff8a83aebbcd"}
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.809321 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4","Type":"ContainerDied","Data":"1e62560da99b919dc301d44628df2b82d5ce22edb4edc48133f1a8949cd82d4b"}
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.809411 4626 scope.go:117] "RemoveContainer" containerID="e986a7e2ce13e9083c49c1ebb06380febd8bf801832f5a1fbba2842bc0dde938"
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.809588 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.840642 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-655f59485b-t5d4q" podStartSLOduration=2.840620358 podStartE2EDuration="2.840620358s" podCreationTimestamp="2026-02-23 07:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:29.835347649 +0000 UTC m=+1182.174676905" watchObservedRunningTime="2026-02-23 07:00:29.840620358 +0000 UTC m=+1182.179949624"
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.962759 4626 scope.go:117] "RemoveContainer" containerID="57c40db0aac8894918eac5a5576697f550611b6b700906b05689ccf45e6659fe"
Feb 23 07:00:29 crc kubenswrapper[4626]: I0223 07:00:29.964180 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.001734 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.013433 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:30 crc kubenswrapper[4626]: E0223 07:00:30.013801 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="probe"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.013822 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="probe"
Feb 23 07:00:30 crc kubenswrapper[4626]: E0223 07:00:30.013848 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="cinder-scheduler"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.013854 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="cinder-scheduler"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.014017 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="cinder-scheduler"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.014026 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" containerName="probe"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.014873 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.017083 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.021080 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.050804 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.050872 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6sh\" (UniqueName: \"kubernetes.io/projected/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-kube-api-access-tx6sh\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.050895 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.050963 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.051006 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.051070 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.153420 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.153473 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.153565 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.153607 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.153646 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx6sh\" (UniqueName: \"kubernetes.io/projected/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-kube-api-access-tx6sh\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.153667 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.154122 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.158925 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.159851 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.176151 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.177443 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6sh\" (UniqueName: \"kubernetes.io/projected/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-kube-api-access-tx6sh\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.179333 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.344527 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.841229 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.856867 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" event={"ID":"5bc64d2f-e681-426e-a19f-1c0da0764ef7","Type":"ContainerStarted","Data":"071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9"}
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.856976 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42"
Feb 23 07:00:30 crc kubenswrapper[4626]: I0223 07:00:30.881253 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" podStartSLOduration=3.881238434 podStartE2EDuration="3.881238434s" podCreationTimestamp="2026-02-23 07:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:30.880164841 +0000 UTC m=+1183.219494106" watchObservedRunningTime="2026-02-23 07:00:30.881238434 +0000 UTC m=+1183.220567700"
Feb 23 07:00:31 crc kubenswrapper[4626]: I0223 07:00:31.655149 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-755f5b5889-45mmc"
Feb 23 07:00:31 crc kubenswrapper[4626]: I0223 07:00:31.747449 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-668d74f4c6-dk9gw"]
Feb 23 07:00:31 crc kubenswrapper[4626]: I0223 07:00:31.750617 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-668d74f4c6-dk9gw" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-api" containerID="cri-o://6b4f7424acae7dc62759458d73b53320f54d2f8667120a35c994c37d533f5e9b" gracePeriod=30
Feb 23 07:00:31 crc kubenswrapper[4626]: I0223 07:00:31.751752 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-668d74f4c6-dk9gw" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-httpd" containerID="cri-o://f789f38e1ff4cfc9f3aea55c1b7d7f406c1e5477a4a6ba46ca01c341a465e91c" gracePeriod=30
Feb 23 07:00:31 crc kubenswrapper[4626]: I0223 07:00:31.873185 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8","Type":"ContainerStarted","Data":"dc18151cae12117a663ba5b43e0db34da19815aef51a8ddf835cb846aad17691"}
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.011273 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4" path="/var/lib/kubelet/pods/4a5d4b2b-46ce-4bef-a8f6-4b877b97b1a4/volumes"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.277919 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6465458495-hgsdz"]
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.280119 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.284228 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.284479 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.285237 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.321383 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6465458495-hgsdz"]
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.421529 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/434e199c-4e18-4274-bbaa-f81f2e2a697b-etc-swift\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.421583 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgnc6\" (UniqueName: \"kubernetes.io/projected/434e199c-4e18-4274-bbaa-f81f2e2a697b-kube-api-access-qgnc6\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.422062 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-internal-tls-certs\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.422110 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-public-tls-certs\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.422177 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/434e199c-4e18-4274-bbaa-f81f2e2a697b-run-httpd\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.422245 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-config-data\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.422286 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/434e199c-4e18-4274-bbaa-f81f2e2a697b-log-httpd\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.422310 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-combined-ca-bundle\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527592 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-config-data\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527657 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/434e199c-4e18-4274-bbaa-f81f2e2a697b-log-httpd\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527686 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-combined-ca-bundle\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527727 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/434e199c-4e18-4274-bbaa-f81f2e2a697b-etc-swift\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527754 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgnc6\" (UniqueName: \"kubernetes.io/projected/434e199c-4e18-4274-bbaa-f81f2e2a697b-kube-api-access-qgnc6\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527792 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-internal-tls-certs\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527826 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-public-tls-certs\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.527883 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/434e199c-4e18-4274-bbaa-f81f2e2a697b-run-httpd\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.528208 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/434e199c-4e18-4274-bbaa-f81f2e2a697b-run-httpd\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.531635 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-config-data\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.531878 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/434e199c-4e18-4274-bbaa-f81f2e2a697b-log-httpd\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.537963 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-combined-ca-bundle\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.538928 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/434e199c-4e18-4274-bbaa-f81f2e2a697b-etc-swift\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.538975 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-internal-tls-certs\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.554198 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgnc6\" (UniqueName: \"kubernetes.io/projected/434e199c-4e18-4274-bbaa-f81f2e2a697b-kube-api-access-qgnc6\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.555600 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/434e199c-4e18-4274-bbaa-f81f2e2a697b-public-tls-certs\") pod \"swift-proxy-6465458495-hgsdz\" (UID: \"434e199c-4e18-4274-bbaa-f81f2e2a697b\") " pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.612692 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6465458495-hgsdz"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.922236 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-545447d6fb-cf9md" event={"ID":"aa38e624-6de3-4e41-841e-21109ef1e723","Type":"ContainerStarted","Data":"e8e25f190c8190eb8bc86c3f0e7e205137cb11dfd1a38de4f415f99e17725246"}
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.923472 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-545447d6fb-cf9md"
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.935637 4626 generic.go:334] "Generic (PLEG): container finished" podID="d0643197-bfab-42f2-bfef-85ab66daf967" containerID="f789f38e1ff4cfc9f3aea55c1b7d7f406c1e5477a4a6ba46ca01c341a465e91c" exitCode=0
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.936100 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668d74f4c6-dk9gw" event={"ID":"d0643197-bfab-42f2-bfef-85ab66daf967","Type":"ContainerDied","Data":"f789f38e1ff4cfc9f3aea55c1b7d7f406c1e5477a4a6ba46ca01c341a465e91c"}
Feb 23 07:00:32 crc kubenswrapper[4626]: I0223 07:00:32.955992 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-545447d6fb-cf9md" podStartSLOduration=2.19888944 podStartE2EDuration="5.955973102s" podCreationTimestamp="2026-02-23 07:00:27 +0000 UTC" firstStartedPulling="2026-02-23 07:00:28.651638747 +0000 UTC m=+1180.990968013" lastFinishedPulling="2026-02-23 07:00:32.408722409 +0000 UTC m=+1184.748051675" observedRunningTime="2026-02-23 07:00:32.942951667 +0000 UTC m=+1185.282280933" watchObservedRunningTime="2026-02-23 07:00:32.955973102 +0000 UTC m=+1185.295302368"
Feb 23 07:00:33 crc kubenswrapper[4626]: I0223 07:00:33.430794 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6465458495-hgsdz"]
Feb 23 07:00:33 crc kubenswrapper[4626]: I0223 07:00:33.635582 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9c5c7b856-snkxr" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Feb 23 07:00:33 crc kubenswrapper[4626]: I0223 07:00:33.980760 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8","Type":"ContainerStarted","Data":"9a55ec6c5a18cae541fb8e7d0bb1e8f0ac5b9e8741347e48973a108fc0325efc"}
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.057669 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5449db68d9-tvqls" event={"ID":"17c8fbb1-7197-4b91-972f-1d916c09cd19","Type":"ContainerStarted","Data":"70429a5ef50a3b0f55f67ed502e003b523eadaa7b8c2bf830093eae084a395a6"}
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.058435 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5449db68d9-tvqls"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.071078 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6465458495-hgsdz" event={"ID":"434e199c-4e18-4274-bbaa-f81f2e2a697b","Type":"ContainerStarted","Data":"4499384de2150e41ade55ea6102fd63e1c51a0cf602c7c6aa72854042a219adf"}
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.075773 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5449db68d9-tvqls" podStartSLOduration=3.638010601 podStartE2EDuration="7.075756945s" podCreationTimestamp="2026-02-23 07:00:27 +0000 UTC" firstStartedPulling="2026-02-23 07:00:29.041094929 +0000 UTC m=+1181.380424195" lastFinishedPulling="2026-02-23 07:00:32.478841273 +0000 UTC m=+1184.818170539" observedRunningTime="2026-02-23 07:00:34.074084532 +0000 UTC m=+1186.413413789" watchObservedRunningTime="2026-02-23 07:00:34.075756945 +0000 UTC m=+1186.415086211"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.129244 4626 generic.go:334] "Generic (PLEG): container finished" podID="d0643197-bfab-42f2-bfef-85ab66daf967" containerID="6b4f7424acae7dc62759458d73b53320f54d2f8667120a35c994c37d533f5e9b" exitCode=0
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.129459 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668d74f4c6-dk9gw" event={"ID":"d0643197-bfab-42f2-bfef-85ab66daf967","Type":"ContainerDied","Data":"6b4f7424acae7dc62759458d73b53320f54d2f8667120a35c994c37d533f5e9b"}
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.158081 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-668d74f4c6-dk9gw"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.316807 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-combined-ca-bundle\") pod \"d0643197-bfab-42f2-bfef-85ab66daf967\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") "
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.317016 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-ovndb-tls-certs\") pod \"d0643197-bfab-42f2-bfef-85ab66daf967\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") "
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.317064 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-httpd-config\") pod \"d0643197-bfab-42f2-bfef-85ab66daf967\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") "
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.317308 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-config\") pod \"d0643197-bfab-42f2-bfef-85ab66daf967\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") "
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.317396 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82v58\" (UniqueName: \"kubernetes.io/projected/d0643197-bfab-42f2-bfef-85ab66daf967-kube-api-access-82v58\") pod \"d0643197-bfab-42f2-bfef-85ab66daf967\" (UID: \"d0643197-bfab-42f2-bfef-85ab66daf967\") "
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.341966 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0643197-bfab-42f2-bfef-85ab66daf967-kube-api-access-82v58" (OuterVolumeSpecName: "kube-api-access-82v58") pod "d0643197-bfab-42f2-bfef-85ab66daf967" (UID: "d0643197-bfab-42f2-bfef-85ab66daf967"). InnerVolumeSpecName "kube-api-access-82v58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.347604 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d0643197-bfab-42f2-bfef-85ab66daf967" (UID: "d0643197-bfab-42f2-bfef-85ab66daf967"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.400655 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5d7b45f997-g8dhd"]
Feb 23 07:00:34 crc kubenswrapper[4626]: E0223 07:00:34.401627 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-api"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.401648 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-api"
Feb 23 07:00:34 crc kubenswrapper[4626]: E0223 07:00:34.401704 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-httpd"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.401717 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-httpd"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.402037 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-api"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.402061 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" containerName="neutron-httpd"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.403193 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5d7b45f997-g8dhd"
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.441616 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82v58\" (UniqueName: \"kubernetes.io/projected/d0643197-bfab-42f2-bfef-85ab66daf967-kube-api-access-82v58\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.441681 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.453047 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d7b45f997-g8dhd"]
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.521918 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-65db596798-kdkj2"]
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.523752 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-config" (OuterVolumeSpecName: "config") pod "d0643197-bfab-42f2-bfef-85ab66daf967" (UID: "d0643197-bfab-42f2-bfef-85ab66daf967"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.524196 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.559562 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-675458dc64-5m4q9"] Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.562271 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565536 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7wmw\" (UniqueName: \"kubernetes.io/projected/e53266a3-8d3d-44af-b0f7-c48a7170ceac-kube-api-access-f7wmw\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565605 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565653 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-config-data-custom\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565700 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjx8\" (UniqueName: \"kubernetes.io/projected/1c0ed8c1-9098-4598-8390-ed8c709fa057-kube-api-access-pbjx8\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: 
\"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565760 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-combined-ca-bundle\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565844 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-combined-ca-bundle\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565887 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-config-data\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.565932 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data-custom\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.566045 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:34 crc 
kubenswrapper[4626]: I0223 07:00:34.585926 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-675458dc64-5m4q9"] Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.605264 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-65db596798-kdkj2"] Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.626923 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0643197-bfab-42f2-bfef-85ab66daf967" (UID: "d0643197-bfab-42f2-bfef-85ab66daf967"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.668754 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.668900 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7wmw\" (UniqueName: \"kubernetes.io/projected/e53266a3-8d3d-44af-b0f7-c48a7170ceac-kube-api-access-f7wmw\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.668940 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.668984 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-config-data-custom\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669019 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjx8\" (UniqueName: \"kubernetes.io/projected/1c0ed8c1-9098-4598-8390-ed8c709fa057-kube-api-access-pbjx8\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669066 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-combined-ca-bundle\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669122 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data-custom\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669148 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-combined-ca-bundle\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669189 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-config-data\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669236 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data-custom\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669309 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-combined-ca-bundle\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.669353 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xw4k\" (UniqueName: \"kubernetes.io/projected/571961b2-ee38-4d10-a958-4157c3624ec2-kube-api-access-9xw4k\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.671721 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.682407 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data-custom\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.686888 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.687007 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-combined-ca-bundle\") pod \"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.687738 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-combined-ca-bundle\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.691012 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-config-data\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.691640 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjx8\" (UniqueName: \"kubernetes.io/projected/1c0ed8c1-9098-4598-8390-ed8c709fa057-kube-api-access-pbjx8\") pod 
\"heat-cfnapi-65db596798-kdkj2\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") " pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.700375 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7wmw\" (UniqueName: \"kubernetes.io/projected/e53266a3-8d3d-44af-b0f7-c48a7170ceac-kube-api-access-f7wmw\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.706145 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53266a3-8d3d-44af-b0f7-c48a7170ceac-config-data-custom\") pod \"heat-engine-5d7b45f997-g8dhd\" (UID: \"e53266a3-8d3d-44af-b0f7-c48a7170ceac\") " pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.716729 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.717137 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d0643197-bfab-42f2-bfef-85ab66daf967" (UID: "d0643197-bfab-42f2-bfef-85ab66daf967"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.775286 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-combined-ca-bundle\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.775333 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xw4k\" (UniqueName: \"kubernetes.io/projected/571961b2-ee38-4d10-a958-4157c3624ec2-kube-api-access-9xw4k\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.775400 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.775561 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data-custom\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.779297 4626 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0643197-bfab-42f2-bfef-85ab66daf967-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.785481 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-combined-ca-bundle\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.785604 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data-custom\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.787367 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.798839 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xw4k\" (UniqueName: \"kubernetes.io/projected/571961b2-ee38-4d10-a958-4157c3624ec2-kube-api-access-9xw4k\") pod \"heat-api-675458dc64-5m4q9\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:34 crc kubenswrapper[4626]: I0223 07:00:34.966289 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.027529 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.215035 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668d74f4c6-dk9gw" event={"ID":"d0643197-bfab-42f2-bfef-85ab66daf967","Type":"ContainerDied","Data":"fd27815428914fc07ee97e6c6fae7f44495c86ef62b2464a2733527953eee774"} Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.215321 4626 scope.go:117] "RemoveContainer" containerID="f789f38e1ff4cfc9f3aea55c1b7d7f406c1e5477a4a6ba46ca01c341a465e91c" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.215519 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-668d74f4c6-dk9gw" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.237330 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8","Type":"ContainerStarted","Data":"400b6c44e6c3520942ccb1e0de0581a5351da7f6d4a842a896c3ea4c8a3700ff"} Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.249916 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6465458495-hgsdz" event={"ID":"434e199c-4e18-4274-bbaa-f81f2e2a697b","Type":"ContainerStarted","Data":"3183ba3d2316ea09c02b7d6b963144284d9f0a233b9ccac2e7f12468036f44f8"} Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.249951 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6465458495-hgsdz" event={"ID":"434e199c-4e18-4274-bbaa-f81f2e2a697b","Type":"ContainerStarted","Data":"fda58e54404be8057fcb759a10d25b15a437202eb0044326a4fe78f486a9350f"} Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.249966 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6465458495-hgsdz" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.249990 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-6465458495-hgsdz" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.269595 4626 scope.go:117] "RemoveContainer" containerID="6b4f7424acae7dc62759458d73b53320f54d2f8667120a35c994c37d533f5e9b" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.294702 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.294686567 podStartE2EDuration="6.294686567s" podCreationTimestamp="2026-02-23 07:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:35.262792307 +0000 UTC m=+1187.602121573" watchObservedRunningTime="2026-02-23 07:00:35.294686567 +0000 UTC m=+1187.634015833" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.342633 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-668d74f4c6-dk9gw"] Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.344719 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.376140 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-668d74f4c6-dk9gw"] Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.379551 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6465458495-hgsdz" podStartSLOduration=3.379527522 podStartE2EDuration="3.379527522s" podCreationTimestamp="2026-02-23 07:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:35.335795027 +0000 UTC m=+1187.675124293" watchObservedRunningTime="2026-02-23 07:00:35.379527522 +0000 UTC m=+1187.718856788" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.408821 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-cfnapi-65db596798-kdkj2"] Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.763669 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.764375 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-central-agent" containerID="cri-o://ef878a1d1d88bf9bd3e350eeb007136b49032f622448769c43e8400b5a7417b4" gracePeriod=30 Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.765151 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="proxy-httpd" containerID="cri-o://38568a9892113266e45e1d53968595da05aefc6cde23102e7c8d0ade9d7d77f2" gracePeriod=30 Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.765217 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="sg-core" containerID="cri-o://e2462c86e98b963cae55f557dabd96a717fe6c2639e81395164dba612ac53c59" gracePeriod=30 Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.765257 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-notification-agent" containerID="cri-o://4be2ac2b8a0146115cfafe4cf8e049c760b81916f143932060e48d95711b3bb8" gracePeriod=30 Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.813827 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d7b45f997-g8dhd"] Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.875060 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="proxy-httpd" 
probeResult="failure" output="Get \"http://10.217.0.171:3000/\": read tcp 10.217.0.2:50272->10.217.0.171:3000: read: connection reset by peer" Feb 23 07:00:35 crc kubenswrapper[4626]: I0223 07:00:35.928138 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-675458dc64-5m4q9"] Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.008768 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0643197-bfab-42f2-bfef-85ab66daf967" path="/var/lib/kubelet/pods/d0643197-bfab-42f2-bfef-85ab66daf967/volumes" Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.260104 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65db596798-kdkj2" event={"ID":"1c0ed8c1-9098-4598-8390-ed8c709fa057","Type":"ContainerDied","Data":"fba9e613be6f7ece502731c1224c36e76395ba171617ceaff5518149a4db23c5"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.260068 4626 generic.go:334] "Generic (PLEG): container finished" podID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerID="fba9e613be6f7ece502731c1224c36e76395ba171617ceaff5518149a4db23c5" exitCode=1 Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.260599 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65db596798-kdkj2" event={"ID":"1c0ed8c1-9098-4598-8390-ed8c709fa057","Type":"ContainerStarted","Data":"ac74afb29eb7d4abd6c00e633ee31340c125187991368bf65861950ed9709b3a"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.260980 4626 scope.go:117] "RemoveContainer" containerID="fba9e613be6f7ece502731c1224c36e76395ba171617ceaff5518149a4db23c5" Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.286354 4626 generic.go:334] "Generic (PLEG): container finished" podID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerID="38568a9892113266e45e1d53968595da05aefc6cde23102e7c8d0ade9d7d77f2" exitCode=0 Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.286384 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerID="e2462c86e98b963cae55f557dabd96a717fe6c2639e81395164dba612ac53c59" exitCode=2 Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.286394 4626 generic.go:334] "Generic (PLEG): container finished" podID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerID="ef878a1d1d88bf9bd3e350eeb007136b49032f622448769c43e8400b5a7417b4" exitCode=0 Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.286443 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerDied","Data":"38568a9892113266e45e1d53968595da05aefc6cde23102e7c8d0ade9d7d77f2"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.286472 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerDied","Data":"e2462c86e98b963cae55f557dabd96a717fe6c2639e81395164dba612ac53c59"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.286483 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerDied","Data":"ef878a1d1d88bf9bd3e350eeb007136b49032f622448769c43e8400b5a7417b4"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.288209 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-675458dc64-5m4q9" event={"ID":"571961b2-ee38-4d10-a958-4157c3624ec2","Type":"ContainerStarted","Data":"d7a3e4cad0b9895b32f875d1f1dd79c8035be033f2556aeb56e4df3ee1081311"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.288238 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-675458dc64-5m4q9" event={"ID":"571961b2-ee38-4d10-a958-4157c3624ec2","Type":"ContainerStarted","Data":"b7398a6e49aa4cf2a4181147d3522055a0deef3c5e0ab215fe39a3a99346c830"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.289545 4626 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.304354 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d7b45f997-g8dhd" event={"ID":"e53266a3-8d3d-44af-b0f7-c48a7170ceac","Type":"ContainerStarted","Data":"300bf0dcb06b3edd10ede2cdc4a83439e0bda9f2cd93d23e3378a05bc1f1c1ef"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.304402 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d7b45f997-g8dhd" event={"ID":"e53266a3-8d3d-44af-b0f7-c48a7170ceac","Type":"ContainerStarted","Data":"452b8186a84567bdb2617ee228e2c520c0417a4d25e864d50937a250a40738c5"} Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.304638 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5d7b45f997-g8dhd" Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.385948 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-675458dc64-5m4q9" podStartSLOduration=2.385920245 podStartE2EDuration="2.385920245s" podCreationTimestamp="2026-02-23 07:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:36.304465182 +0000 UTC m=+1188.643794448" watchObservedRunningTime="2026-02-23 07:00:36.385920245 +0000 UTC m=+1188.725249530" Feb 23 07:00:36 crc kubenswrapper[4626]: I0223 07:00:36.397065 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5d7b45f997-g8dhd" podStartSLOduration=2.39704807 podStartE2EDuration="2.39704807s" podCreationTimestamp="2026-02-23 07:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:36.335850373 +0000 UTC m=+1188.675179639" watchObservedRunningTime="2026-02-23 07:00:36.39704807 +0000 
UTC m=+1188.736377336" Feb 23 07:00:37 crc kubenswrapper[4626]: I0223 07:00:37.343189 4626 generic.go:334] "Generic (PLEG): container finished" podID="571961b2-ee38-4d10-a958-4157c3624ec2" containerID="d7a3e4cad0b9895b32f875d1f1dd79c8035be033f2556aeb56e4df3ee1081311" exitCode=1 Feb 23 07:00:37 crc kubenswrapper[4626]: I0223 07:00:37.344611 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-675458dc64-5m4q9" event={"ID":"571961b2-ee38-4d10-a958-4157c3624ec2","Type":"ContainerDied","Data":"d7a3e4cad0b9895b32f875d1f1dd79c8035be033f2556aeb56e4df3ee1081311"} Feb 23 07:00:37 crc kubenswrapper[4626]: I0223 07:00:37.347058 4626 scope.go:117] "RemoveContainer" containerID="d7a3e4cad0b9895b32f875d1f1dd79c8035be033f2556aeb56e4df3ee1081311" Feb 23 07:00:37 crc kubenswrapper[4626]: I0223 07:00:37.357013 4626 generic.go:334] "Generic (PLEG): container finished" podID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerID="7166a173be963bb9807b17c0a71a6fd8548daf87a8e39e9046af32464079eb28" exitCode=1 Feb 23 07:00:37 crc kubenswrapper[4626]: I0223 07:00:37.359399 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65db596798-kdkj2" event={"ID":"1c0ed8c1-9098-4598-8390-ed8c709fa057","Type":"ContainerDied","Data":"7166a173be963bb9807b17c0a71a6fd8548daf87a8e39e9046af32464079eb28"} Feb 23 07:00:37 crc kubenswrapper[4626]: I0223 07:00:37.359452 4626 scope.go:117] "RemoveContainer" containerID="fba9e613be6f7ece502731c1224c36e76395ba171617ceaff5518149a4db23c5" Feb 23 07:00:37 crc kubenswrapper[4626]: I0223 07:00:37.359879 4626 scope.go:117] "RemoveContainer" containerID="7166a173be963bb9807b17c0a71a6fd8548daf87a8e39e9046af32464079eb28" Feb 23 07:00:37 crc kubenswrapper[4626]: E0223 07:00:37.360071 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi 
pod=heat-cfnapi-65db596798-kdkj2_openstack(1c0ed8c1-9098-4598-8390-ed8c709fa057)\"" pod="openstack/heat-cfnapi-65db596798-kdkj2" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.038642 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.128059 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855f884985-s9fqw"] Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.128262 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-855f884985-s9fqw" podUID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerName="dnsmasq-dns" containerID="cri-o://2088a08dd29b6e6be47be4d391ea0ef87aa51a3b896e88596e2767a0074ae42b" gracePeriod=10 Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.387299 4626 generic.go:334] "Generic (PLEG): container finished" podID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerID="2088a08dd29b6e6be47be4d391ea0ef87aa51a3b896e88596e2767a0074ae42b" exitCode=0 Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.387614 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855f884985-s9fqw" event={"ID":"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7","Type":"ContainerDied","Data":"2088a08dd29b6e6be47be4d391ea0ef87aa51a3b896e88596e2767a0074ae42b"} Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.393798 4626 scope.go:117] "RemoveContainer" containerID="7166a173be963bb9807b17c0a71a6fd8548daf87a8e39e9046af32464079eb28" Feb 23 07:00:38 crc kubenswrapper[4626]: E0223 07:00:38.394145 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-65db596798-kdkj2_openstack(1c0ed8c1-9098-4598-8390-ed8c709fa057)\"" 
pod="openstack/heat-cfnapi-65db596798-kdkj2" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.422864 4626 generic.go:334] "Generic (PLEG): container finished" podID="571961b2-ee38-4d10-a958-4157c3624ec2" containerID="8f2821699d14eb43e02ba8822dad12c1d52ba059ce33cbad7c6affe4981980c9" exitCode=1 Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.422904 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-675458dc64-5m4q9" event={"ID":"571961b2-ee38-4d10-a958-4157c3624ec2","Type":"ContainerDied","Data":"8f2821699d14eb43e02ba8822dad12c1d52ba059ce33cbad7c6affe4981980c9"} Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.422940 4626 scope.go:117] "RemoveContainer" containerID="d7a3e4cad0b9895b32f875d1f1dd79c8035be033f2556aeb56e4df3ee1081311" Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.423634 4626 scope.go:117] "RemoveContainer" containerID="8f2821699d14eb43e02ba8822dad12c1d52ba059ce33cbad7c6affe4981980c9" Feb 23 07:00:38 crc kubenswrapper[4626]: E0223 07:00:38.424008 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-675458dc64-5m4q9_openstack(571961b2-ee38-4d10-a958-4157c3624ec2)\"" pod="openstack/heat-api-675458dc64-5m4q9" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" Feb 23 07:00:38 crc kubenswrapper[4626]: I0223 07:00:38.872362 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.068087 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-sb\") pod \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.068443 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-swift-storage-0\") pod \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.068469 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-nb\") pod \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.068604 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc\") pod \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.068704 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd5fp\" (UniqueName: \"kubernetes.io/projected/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-kube-api-access-zd5fp\") pod \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.068797 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-config\") pod \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.094800 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-kube-api-access-zd5fp" (OuterVolumeSpecName: "kube-api-access-zd5fp") pod "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" (UID: "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7"). InnerVolumeSpecName "kube-api-access-zd5fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.178893 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" (UID: "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.180809 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd5fp\" (UniqueName: \"kubernetes.io/projected/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-kube-api-access-zd5fp\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.180975 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.194212 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-545447d6fb-cf9md"] Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.194438 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-545447d6fb-cf9md" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" containerName="heat-cfnapi" containerID="cri-o://e8e25f190c8190eb8bc86c3f0e7e205137cb11dfd1a38de4f415f99e17725246" gracePeriod=60 Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.227009 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-545447d6fb-cf9md" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.174:8000/healthcheck\": EOF" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.232589 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-545447d6fb-cf9md" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.174:8000/healthcheck\": EOF" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.234835 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5449db68d9-tvqls"] Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 
07:00:39.235057 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5449db68d9-tvqls" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api" containerID="cri-o://70429a5ef50a3b0f55f67ed502e003b523eadaa7b8c2bf830093eae084a395a6" gracePeriod=60 Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.238027 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" (UID: "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.250786 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-api-5449db68d9-tvqls" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.176:8004/healthcheck\": EOF" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.250969 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5449db68d9-tvqls" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.176:8004/healthcheck\": EOF" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.256972 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" (UID: "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:39 crc kubenswrapper[4626]: E0223 07:00:39.275772 4626 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc podName:167dcf2c-2be6-4e2a-a331-1d7ca5034ba7 nodeName:}" failed. No retries permitted until 2026-02-23 07:00:39.775742076 +0000 UTC m=+1192.115071341 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc") pod "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" (UID: "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7") : error deleting /var/lib/kubelet/pods/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7/volume-subpaths: remove /var/lib/kubelet/pods/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7/volume-subpaths: no such file or directory Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.276023 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-config" (OuterVolumeSpecName: "config") pod "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" (UID: "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.282515 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.282541 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.282551 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.283274 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-54768cd758-d6bbc"] Feb 23 07:00:39 crc kubenswrapper[4626]: E0223 07:00:39.283658 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerName="dnsmasq-dns" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.283676 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerName="dnsmasq-dns" Feb 23 07:00:39 crc kubenswrapper[4626]: E0223 07:00:39.283701 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerName="init" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.283707 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerName="init" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.283911 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" containerName="dnsmasq-dns" Feb 23 07:00:39 crc 
kubenswrapper[4626]: I0223 07:00:39.284580 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.305596 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.305778 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.328631 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7f6ddcf74f-chh24"] Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.330158 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.333205 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.333342 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.363776 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f6ddcf74f-chh24"] Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.375666 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54768cd758-d6bbc"] Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.385654 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-config-data\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.385757 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmw5\" (UniqueName: \"kubernetes.io/projected/bdb47037-af0c-4d21-9e61-53b65fb113d1-kube-api-access-bsmw5\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.385871 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-public-tls-certs\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.385943 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-internal-tls-certs\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386017 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-internal-tls-certs\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386131 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d794v\" (UniqueName: \"kubernetes.io/projected/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-kube-api-access-d794v\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" 
Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386250 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-config-data\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386341 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-combined-ca-bundle\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386415 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-config-data-custom\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386520 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-combined-ca-bundle\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386730 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-config-data-custom\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " 
pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.386874 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-public-tls-certs\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.469234 4626 scope.go:117] "RemoveContainer" containerID="8f2821699d14eb43e02ba8822dad12c1d52ba059ce33cbad7c6affe4981980c9" Feb 23 07:00:39 crc kubenswrapper[4626]: E0223 07:00:39.469491 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-675458dc64-5m4q9_openstack(571961b2-ee38-4d10-a958-4157c3624ec2)\"" pod="openstack/heat-api-675458dc64-5m4q9" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.476172 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855f884985-s9fqw" event={"ID":"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7","Type":"ContainerDied","Data":"a5385bc3d17051e665df6c43a7859dbd071711e927594df673e814543c92b35e"} Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.476210 4626 scope.go:117] "RemoveContainer" containerID="2088a08dd29b6e6be47be4d391ea0ef87aa51a3b896e88596e2767a0074ae42b" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.476318 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855f884985-s9fqw" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488624 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-config-data\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488663 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmw5\" (UniqueName: \"kubernetes.io/projected/bdb47037-af0c-4d21-9e61-53b65fb113d1-kube-api-access-bsmw5\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488700 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-public-tls-certs\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488721 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-internal-tls-certs\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488748 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-internal-tls-certs\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " 
pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488805 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d794v\" (UniqueName: \"kubernetes.io/projected/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-kube-api-access-d794v\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488887 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-config-data\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488932 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-combined-ca-bundle\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.488957 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-config-data-custom\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.489015 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-combined-ca-bundle\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 
23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.489040 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-config-data-custom\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.489075 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-public-tls-certs\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.497306 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-config-data\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.498017 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-internal-tls-certs\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.498661 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-config-data\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.499386 4626 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-config-data-custom\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.500099 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-config-data-custom\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.500305 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-public-tls-certs\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.501122 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-combined-ca-bundle\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.505034 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-internal-tls-certs\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.508649 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-public-tls-certs\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.513187 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb47037-af0c-4d21-9e61-53b65fb113d1-combined-ca-bundle\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.526428 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d794v\" (UniqueName: \"kubernetes.io/projected/05f9d0eb-fafc-496f-9fe2-8923f9d8db61-kube-api-access-d794v\") pod \"heat-cfnapi-7f6ddcf74f-chh24\" (UID: \"05f9d0eb-fafc-496f-9fe2-8923f9d8db61\") " pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.531656 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmw5\" (UniqueName: \"kubernetes.io/projected/bdb47037-af0c-4d21-9e61-53b65fb113d1-kube-api-access-bsmw5\") pod \"heat-api-54768cd758-d6bbc\" (UID: \"bdb47037-af0c-4d21-9e61-53b65fb113d1\") " pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.625636 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.655325 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.717545 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.718042 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-65db596798-kdkj2" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.718720 4626 scope.go:117] "RemoveContainer" containerID="7166a173be963bb9807b17c0a71a6fd8548daf87a8e39e9046af32464079eb28" Feb 23 07:00:39 crc kubenswrapper[4626]: E0223 07:00:39.719031 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-65db596798-kdkj2_openstack(1c0ed8c1-9098-4598-8390-ed8c709fa057)\"" pod="openstack/heat-cfnapi-65db596798-kdkj2" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.803399 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc\") pod \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\" (UID: \"167dcf2c-2be6-4e2a-a331-1d7ca5034ba7\") " Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.803917 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" (UID: "167dcf2c-2be6-4e2a-a331-1d7ca5034ba7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:39 crc kubenswrapper[4626]: I0223 07:00:39.808834 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.028216 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.028291 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.096228 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855f884985-s9fqw"] Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.103623 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855f884985-s9fqw"] Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.501566 4626 generic.go:334] "Generic (PLEG): container finished" podID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerID="4be2ac2b8a0146115cfafe4cf8e049c760b81916f143932060e48d95711b3bb8" exitCode=0 Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.502622 4626 scope.go:117] "RemoveContainer" containerID="8f2821699d14eb43e02ba8822dad12c1d52ba059ce33cbad7c6affe4981980c9" Feb 23 07:00:40 crc kubenswrapper[4626]: E0223 07:00:40.503044 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-675458dc64-5m4q9_openstack(571961b2-ee38-4d10-a958-4157c3624ec2)\"" pod="openstack/heat-api-675458dc64-5m4q9" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.503402 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerDied","Data":"4be2ac2b8a0146115cfafe4cf8e049c760b81916f143932060e48d95711b3bb8"} Feb 23 07:00:40 crc kubenswrapper[4626]: I0223 07:00:40.700315 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 07:00:41 crc kubenswrapper[4626]: I0223 07:00:41.991218 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="167dcf2c-2be6-4e2a-a331-1d7ca5034ba7" path="/var/lib/kubelet/pods/167dcf2c-2be6-4e2a-a331-1d7ca5034ba7/volumes" Feb 23 07:00:42 crc kubenswrapper[4626]: I0223 07:00:42.621259 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6465458495-hgsdz" Feb 23 07:00:42 crc kubenswrapper[4626]: I0223 07:00:42.623386 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6465458495-hgsdz" Feb 23 07:00:43 crc kubenswrapper[4626]: I0223 07:00:43.635821 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9c5c7b856-snkxr" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Feb 23 07:00:43 crc kubenswrapper[4626]: I0223 07:00:43.637164 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.322775 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.323356 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-log" 
containerID="cri-o://13b00ec4e33abde6af8fa25378a16255506bb30adff2ddb2d6b3866f55040d08" gracePeriod=30 Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.323856 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-httpd" containerID="cri-o://d8c9929a3dcc07967d2a2cb42b1477ead3837767b756bbeab0bfbc273efc72a5" gracePeriod=30 Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.550082 4626 generic.go:334] "Generic (PLEG): container finished" podID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerID="13b00ec4e33abde6af8fa25378a16255506bb30adff2ddb2d6b3866f55040d08" exitCode=143 Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.550155 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58f4f2ad-1b9a-4be4-a535-32183ce254a6","Type":"ContainerDied","Data":"13b00ec4e33abde6af8fa25378a16255506bb30adff2ddb2d6b3866f55040d08"} Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.661760 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5449db68d9-tvqls" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.176:8004/healthcheck\": read tcp 10.217.0.2:60478->10.217.0.176:8004: read: connection reset by peer" Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.662206 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5449db68d9-tvqls" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.176:8004/healthcheck\": dial tcp 10.217.0.176:8004: connect: connection refused" Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.671859 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-545447d6fb-cf9md" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" 
containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.174:8000/healthcheck\": read tcp 10.217.0.2:37698->10.217.0.174:8000: read: connection reset by peer" Feb 23 07:00:44 crc kubenswrapper[4626]: I0223 07:00:44.672288 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-545447d6fb-cf9md" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.174:8000/healthcheck\": dial tcp 10.217.0.174:8000: connect: connection refused" Feb 23 07:00:45 crc kubenswrapper[4626]: I0223 07:00:45.562657 4626 generic.go:334] "Generic (PLEG): container finished" podID="f0006614-96ac-4260-a959-a66db83df548" containerID="0f7e9544924a8de544e50ff8752bbcc36e4412b8d98d51da038f2153fa68a305" exitCode=137 Feb 23 07:00:45 crc kubenswrapper[4626]: I0223 07:00:45.562745 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0006614-96ac-4260-a959-a66db83df548","Type":"ContainerDied","Data":"0f7e9544924a8de544e50ff8752bbcc36e4412b8d98d51da038f2153fa68a305"} Feb 23 07:00:45 crc kubenswrapper[4626]: I0223 07:00:45.565482 4626 generic.go:334] "Generic (PLEG): container finished" podID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerID="70429a5ef50a3b0f55f67ed502e003b523eadaa7b8c2bf830093eae084a395a6" exitCode=0 Feb 23 07:00:45 crc kubenswrapper[4626]: I0223 07:00:45.565533 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5449db68d9-tvqls" event={"ID":"17c8fbb1-7197-4b91-972f-1d916c09cd19","Type":"ContainerDied","Data":"70429a5ef50a3b0f55f67ed502e003b523eadaa7b8c2bf830093eae084a395a6"} Feb 23 07:00:45 crc kubenswrapper[4626]: I0223 07:00:45.567779 4626 generic.go:334] "Generic (PLEG): container finished" podID="aa38e624-6de3-4e41-841e-21109ef1e723" containerID="e8e25f190c8190eb8bc86c3f0e7e205137cb11dfd1a38de4f415f99e17725246" exitCode=0 Feb 23 07:00:45 crc kubenswrapper[4626]: I0223 07:00:45.567891 4626 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-545447d6fb-cf9md" event={"ID":"aa38e624-6de3-4e41-841e-21109ef1e723","Type":"ContainerDied","Data":"e8e25f190c8190eb8bc86c3f0e7e205137cb11dfd1a38de4f415f99e17725246"} Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.428740 4626 scope.go:117] "RemoveContainer" containerID="9c4b8ccbf45740b21ab8ccd4f79dd6a7a91455b87dab4dddeeea90395c637192" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.696777 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-545447d6fb-cf9md" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.802099 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data\") pod \"aa38e624-6de3-4e41-841e-21109ef1e723\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.802184 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data-custom\") pod \"aa38e624-6de3-4e41-841e-21109ef1e723\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.802291 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfq4z\" (UniqueName: \"kubernetes.io/projected/aa38e624-6de3-4e41-841e-21109ef1e723-kube-api-access-gfq4z\") pod \"aa38e624-6de3-4e41-841e-21109ef1e723\" (UID: \"aa38e624-6de3-4e41-841e-21109ef1e723\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.802399 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-combined-ca-bundle\") pod \"aa38e624-6de3-4e41-841e-21109ef1e723\" (UID: 
\"aa38e624-6de3-4e41-841e-21109ef1e723\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.815371 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa38e624-6de3-4e41-841e-21109ef1e723-kube-api-access-gfq4z" (OuterVolumeSpecName: "kube-api-access-gfq4z") pod "aa38e624-6de3-4e41-841e-21109ef1e723" (UID: "aa38e624-6de3-4e41-841e-21109ef1e723"). InnerVolumeSpecName "kube-api-access-gfq4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.823104 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aa38e624-6de3-4e41-841e-21109ef1e723" (UID: "aa38e624-6de3-4e41-841e-21109ef1e723"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.843774 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa38e624-6de3-4e41-841e-21109ef1e723" (UID: "aa38e624-6de3-4e41-841e-21109ef1e723"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.868872 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.885997 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data" (OuterVolumeSpecName: "config-data") pod "aa38e624-6de3-4e41-841e-21109ef1e723" (UID: "aa38e624-6de3-4e41-841e-21109ef1e723"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.905197 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd2nz\" (UniqueName: \"kubernetes.io/projected/f0006614-96ac-4260-a959-a66db83df548-kube-api-access-bd2nz\") pod \"f0006614-96ac-4260-a959-a66db83df548\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.905233 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-combined-ca-bundle\") pod \"f0006614-96ac-4260-a959-a66db83df548\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.905254 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0006614-96ac-4260-a959-a66db83df548-logs\") pod \"f0006614-96ac-4260-a959-a66db83df548\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.905389 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0006614-96ac-4260-a959-a66db83df548-etc-machine-id\") pod \"f0006614-96ac-4260-a959-a66db83df548\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.905453 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data-custom\") pod \"f0006614-96ac-4260-a959-a66db83df548\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.905631 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-scripts\") pod \"f0006614-96ac-4260-a959-a66db83df548\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.905662 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data\") pod \"f0006614-96ac-4260-a959-a66db83df548\" (UID: \"f0006614-96ac-4260-a959-a66db83df548\") " Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.906361 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.906373 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.906382 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfq4z\" (UniqueName: \"kubernetes.io/projected/aa38e624-6de3-4e41-841e-21109ef1e723-kube-api-access-gfq4z\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.906390 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa38e624-6de3-4e41-841e-21109ef1e723-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.906728 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f0006614-96ac-4260-a959-a66db83df548-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f0006614-96ac-4260-a959-a66db83df548" (UID: "f0006614-96ac-4260-a959-a66db83df548"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.907583 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0006614-96ac-4260-a959-a66db83df548-logs" (OuterVolumeSpecName: "logs") pod "f0006614-96ac-4260-a959-a66db83df548" (UID: "f0006614-96ac-4260-a959-a66db83df548"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.911807 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0006614-96ac-4260-a959-a66db83df548-kube-api-access-bd2nz" (OuterVolumeSpecName: "kube-api-access-bd2nz") pod "f0006614-96ac-4260-a959-a66db83df548" (UID: "f0006614-96ac-4260-a959-a66db83df548"). InnerVolumeSpecName "kube-api-access-bd2nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.912324 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-scripts" (OuterVolumeSpecName: "scripts") pod "f0006614-96ac-4260-a959-a66db83df548" (UID: "f0006614-96ac-4260-a959-a66db83df548"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.928705 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f0006614-96ac-4260-a959-a66db83df548" (UID: "f0006614-96ac-4260-a959-a66db83df548"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.950750 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5449db68d9-tvqls" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.958047 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0006614-96ac-4260-a959-a66db83df548" (UID: "f0006614-96ac-4260-a959-a66db83df548"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.961675 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:00:46 crc kubenswrapper[4626]: I0223 07:00:46.994575 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data" (OuterVolumeSpecName: "config-data") pod "f0006614-96ac-4260-a959-a66db83df548" (UID: "f0006614-96ac-4260-a959-a66db83df548"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.007558 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-scripts\") pod \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.007729 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-run-httpd\") pod \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.007838 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-combined-ca-bundle\") pod \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.007908 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data-custom\") pod \"17c8fbb1-7197-4b91-972f-1d916c09cd19\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.008000 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7prg8\" (UniqueName: \"kubernetes.io/projected/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-kube-api-access-7prg8\") pod \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.008091 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-log-httpd\") pod \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.008262 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-combined-ca-bundle\") pod \"17c8fbb1-7197-4b91-972f-1d916c09cd19\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.008404 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zfwx\" (UniqueName: \"kubernetes.io/projected/17c8fbb1-7197-4b91-972f-1d916c09cd19-kube-api-access-9zfwx\") pod \"17c8fbb1-7197-4b91-972f-1d916c09cd19\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.008528 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data\") pod \"17c8fbb1-7197-4b91-972f-1d916c09cd19\" (UID: \"17c8fbb1-7197-4b91-972f-1d916c09cd19\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.008710 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-config-data\") pod \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.008822 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-sg-core-conf-yaml\") pod \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\" (UID: \"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de\") " Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 
07:00:47.009397 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.009461 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.012140 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd2nz\" (UniqueName: \"kubernetes.io/projected/f0006614-96ac-4260-a959-a66db83df548-kube-api-access-bd2nz\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.012223 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.012305 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0006614-96ac-4260-a959-a66db83df548-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.012360 4626 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0006614-96ac-4260-a959-a66db83df548-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.012416 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0006614-96ac-4260-a959-a66db83df548-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.009951 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" (UID: "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.017607 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-scripts" (OuterVolumeSpecName: "scripts") pod "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" (UID: "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.018176 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" (UID: "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.022148 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "17c8fbb1-7197-4b91-972f-1d916c09cd19" (UID: "17c8fbb1-7197-4b91-972f-1d916c09cd19"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.046837 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-kube-api-access-7prg8" (OuterVolumeSpecName: "kube-api-access-7prg8") pod "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" (UID: "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de"). InnerVolumeSpecName "kube-api-access-7prg8". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.048123 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c8fbb1-7197-4b91-972f-1d916c09cd19-kube-api-access-9zfwx" (OuterVolumeSpecName: "kube-api-access-9zfwx") pod "17c8fbb1-7197-4b91-972f-1d916c09cd19" (UID: "17c8fbb1-7197-4b91-972f-1d916c09cd19"). InnerVolumeSpecName "kube-api-access-9zfwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.077957 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17c8fbb1-7197-4b91-972f-1d916c09cd19" (UID: "17c8fbb1-7197-4b91-972f-1d916c09cd19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.086754 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" (UID: "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.091884 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data" (OuterVolumeSpecName: "config-data") pod "17c8fbb1-7197-4b91-972f-1d916c09cd19" (UID: "17c8fbb1-7197-4b91-972f-1d916c09cd19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.102959 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7f6ddcf74f-chh24"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114345 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114374 4626 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114385 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114402 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7prg8\" (UniqueName: \"kubernetes.io/projected/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-kube-api-access-7prg8\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114410 4626 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114418 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114427 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zfwx\" (UniqueName: \"kubernetes.io/projected/17c8fbb1-7197-4b91-972f-1d916c09cd19-kube-api-access-9zfwx\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114437 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17c8fbb1-7197-4b91-972f-1d916c09cd19-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.114445 4626 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: W0223 07:00:47.118148 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdb47037_af0c_4d21_9e61_53b65fb113d1.slice/crio-14469869eb6ed908ce800d316c5107d662bcdf0eb18258d5fba5ef14fcf701b5 WatchSource:0}: Error finding container 14469869eb6ed908ce800d316c5107d662bcdf0eb18258d5fba5ef14fcf701b5: Status 404 returned error can't find the container with id 14469869eb6ed908ce800d316c5107d662bcdf0eb18258d5fba5ef14fcf701b5
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.120409 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54768cd758-d6bbc"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.166860 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-config-data" (OuterVolumeSpecName: "config-data") pod "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" (UID: "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.168216 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" (UID: "5c9c23ae-e41d-48eb-8d54-2887f2b0e9de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.216977 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.217015 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.500111 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9292/healthcheck\": read tcp 10.217.0.2:37912->10.217.0.158:9292: read: connection reset by peer"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.501438 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.158:9292/healthcheck\": read tcp 10.217.0.2:37906->10.217.0.158:9292: read: connection reset by peer"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.594095 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-655f59485b-t5d4q"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.645130 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" event={"ID":"05f9d0eb-fafc-496f-9fe2-8923f9d8db61","Type":"ContainerStarted","Data":"3df0730ce70142cc66de9de3433900912e825b6b8e36a682548c9ea2ed2798d5"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.645174 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" event={"ID":"05f9d0eb-fafc-496f-9fe2-8923f9d8db61","Type":"ContainerStarted","Data":"3aab5007b1760411f3ba7b44d25a1ad6a1c4f178da76b24c58ceba1a0d021074"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.645325 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7f6ddcf74f-chh24"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.650531 4626 generic.go:334] "Generic (PLEG): container finished" podID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerID="d8c9929a3dcc07967d2a2cb42b1477ead3837767b756bbeab0bfbc273efc72a5" exitCode=0
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.650620 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58f4f2ad-1b9a-4be4-a535-32183ce254a6","Type":"ContainerDied","Data":"d8c9929a3dcc07967d2a2cb42b1477ead3837767b756bbeab0bfbc273efc72a5"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.655302 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c9c23ae-e41d-48eb-8d54-2887f2b0e9de","Type":"ContainerDied","Data":"6fb4920403eb87fd81c769722049cfd430f8ec450bf1e29a6b922b78d005a8c2"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.655343 4626 scope.go:117] "RemoveContainer" containerID="38568a9892113266e45e1d53968595da05aefc6cde23102e7c8d0ade9d7d77f2"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.655463 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.660491 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54768cd758-d6bbc" event={"ID":"bdb47037-af0c-4d21-9e61-53b65fb113d1","Type":"ContainerStarted","Data":"de0cf6656a946dd67d562a18039bacbb669909e3bd087cdcda0aafb1fce14082"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.660557 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54768cd758-d6bbc" event={"ID":"bdb47037-af0c-4d21-9e61-53b65fb113d1","Type":"ContainerStarted","Data":"14469869eb6ed908ce800d316c5107d662bcdf0eb18258d5fba5ef14fcf701b5"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.661146 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-54768cd758-d6bbc"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.668435 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f0006614-96ac-4260-a959-a66db83df548","Type":"ContainerDied","Data":"7b3247c3ea281d8e7ce2b222eb29aed16fd28083763e6d4fa4d383c7565a291a"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.668559 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.679381 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5449db68d9-tvqls"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.679534 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5449db68d9-tvqls" event={"ID":"17c8fbb1-7197-4b91-972f-1d916c09cd19","Type":"ContainerDied","Data":"67af7f4c95de43f3b18932ed67c1782df7ac9d39c437d55e1a4cff8a83aebbcd"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.688166 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-545447d6fb-cf9md"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.689601 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-545447d6fb-cf9md" event={"ID":"aa38e624-6de3-4e41-841e-21109ef1e723","Type":"ContainerDied","Data":"dddf5c49ffa58c2a2d0ae92c22c00127bfe6cde4ae5e28e4fa1a0068ac418641"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.692875 4626 scope.go:117] "RemoveContainer" containerID="e2462c86e98b963cae55f557dabd96a717fe6c2639e81395164dba612ac53c59"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.693283 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" podStartSLOduration=8.693272702 podStartE2EDuration="8.693272702s" podCreationTimestamp="2026-02-23 07:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:47.670532566 +0000 UTC m=+1200.009861852" watchObservedRunningTime="2026-02-23 07:00:47.693272702 +0000 UTC m=+1200.032601957"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.694689 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-54768cd758-d6bbc" podStartSLOduration=8.69468269 podStartE2EDuration="8.69468269s" podCreationTimestamp="2026-02-23 07:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:47.692978337 +0000 UTC m=+1200.032307602" watchObservedRunningTime="2026-02-23 07:00:47.69468269 +0000 UTC m=+1200.034011956"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.697206 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"36431530-ec45-4670-bc2c-ababbf867d6f","Type":"ContainerStarted","Data":"da7d8393f0fd80b335f12b632786b5fd69381e2c344b44ffa6f4b8d41e77bdf6"}
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.743688 4626 scope.go:117] "RemoveContainer" containerID="4be2ac2b8a0146115cfafe4cf8e049c760b81916f143932060e48d95711b3bb8"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.747140 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.759528 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.800436 4626 scope.go:117] "RemoveContainer" containerID="ef878a1d1d88bf9bd3e350eeb007136b49032f622448769c43e8400b5a7417b4"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806137 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806626 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" containerName="heat-cfnapi"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806644 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" containerName="heat-cfnapi"
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806664 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806670 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api"
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806684 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api-log"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806689 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api-log"
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806699 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="proxy-httpd"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806705 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="proxy-httpd"
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806720 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-notification-agent"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806725 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-notification-agent"
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806734 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-central-agent"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806741 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-central-agent"
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806750 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="sg-core"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806755 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="sg-core"
Feb 23 07:00:47 crc kubenswrapper[4626]: E0223 07:00:47.806773 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806778 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806977 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-notification-agent"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806987 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="sg-core"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.806996 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" containerName="heat-cfnapi"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.807004 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.807013 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" containerName="heat-api"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.807027 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="proxy-httpd"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.807037 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="ceilometer-central-agent"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.807048 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0006614-96ac-4260-a959-a66db83df548" containerName="cinder-api-log"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.808043 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.808733 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.591696673 podStartE2EDuration="23.808716091s" podCreationTimestamp="2026-02-23 07:00:24 +0000 UTC" firstStartedPulling="2026-02-23 07:00:25.230668567 +0000 UTC m=+1177.569997833" lastFinishedPulling="2026-02-23 07:00:46.447687985 +0000 UTC m=+1198.787017251" observedRunningTime="2026-02-23 07:00:47.758080224 +0000 UTC m=+1200.097409490" watchObservedRunningTime="2026-02-23 07:00:47.808716091 +0000 UTC m=+1200.148045348"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.812119 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.812353 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.819190 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.828220 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.844623 4626 scope.go:117] "RemoveContainer" containerID="0f7e9544924a8de544e50ff8752bbcc36e4412b8d98d51da038f2153fa68a305"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.844865 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.844976 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-scripts\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.845093 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgrc\" (UniqueName: \"kubernetes.io/projected/a5f9731f-2161-4757-97a7-e542f744362c-kube-api-access-2hgrc\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.845169 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.845220 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.845253 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-config-data\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.845310 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5f9731f-2161-4757-97a7-e542f744362c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.845476 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5f9731f-2161-4757-97a7-e542f744362c-logs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.845536 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.880906 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.931006 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.960944 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5449db68d9-tvqls"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.972780 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5449db68d9-tvqls"]
Feb 23 07:00:47 crc kubenswrapper[4626]: I0223 07:00:47.982025 4626 scope.go:117] "RemoveContainer" containerID="87705677b2dfb886b96b1b16b201b53deaeba92b52afba2c8420755e4fca47b0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007442 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgrc\" (UniqueName: \"kubernetes.io/projected/a5f9731f-2161-4757-97a7-e542f744362c-kube-api-access-2hgrc\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007579 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007613 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007641 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-config-data\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007673 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5f9731f-2161-4757-97a7-e542f744362c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007724 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5f9731f-2161-4757-97a7-e542f744362c-logs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007752 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007801 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.007843 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-scripts\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.018584 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-scripts\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.019096 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5f9731f-2161-4757-97a7-e542f744362c-logs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.019162 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5f9731f-2161-4757-97a7-e542f744362c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.040564 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.050317 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.055237 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgrc\" (UniqueName: \"kubernetes.io/projected/a5f9731f-2161-4757-97a7-e542f744362c-kube-api-access-2hgrc\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.040895 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.056014 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.067611 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-config-data\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.074976 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c8fbb1-7197-4b91-972f-1d916c09cd19" path="/var/lib/kubelet/pods/17c8fbb1-7197-4b91-972f-1d916c09cd19/volumes"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.081017 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.081255 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.081664 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" path="/var/lib/kubelet/pods/5c9c23ae-e41d-48eb-8d54-2887f2b0e9de/volumes"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.082675 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0006614-96ac-4260-a959-a66db83df548" path="/var/lib/kubelet/pods/f0006614-96ac-4260-a959-a66db83df548/volumes"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.083537 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.091088 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.091223 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.092970 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f9731f-2161-4757-97a7-e542f744362c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5f9731f-2161-4757-97a7-e542f744362c\") " pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.099603 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-545447d6fb-cf9md"]
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.104040 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.104119 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.133161 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.152738 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-545447d6fb-cf9md"]
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.200132 4626 scope.go:117] "RemoveContainer" containerID="70429a5ef50a3b0f55f67ed502e003b523eadaa7b8c2bf830093eae084a395a6"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.208155 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.237768 4626 scope.go:117] "RemoveContainer" containerID="e8e25f190c8190eb8bc86c3f0e7e205137cb11dfd1a38de4f415f99e17725246"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.238252 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-run-httpd\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.238422 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-scripts\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.238518 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.238560 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786k8\" (UniqueName: \"kubernetes.io/projected/a57d585c-03c6-4731-8271-2398c0d774ad-kube-api-access-786k8\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.238643 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-config-data\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.238696 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.238751 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-log-httpd\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0"
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.340405 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-internal-tls-certs\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.340590 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-httpd-run\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.340676 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-config-data\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.340789 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j8gd\" (UniqueName: \"kubernetes.io/projected/58f4f2ad-1b9a-4be4-a535-32183ce254a6-kube-api-access-8j8gd\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.345652 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-logs\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.345689 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.345734 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-scripts\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.345761 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-combined-ca-bundle\") pod \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\" (UID: \"58f4f2ad-1b9a-4be4-a535-32183ce254a6\") "
Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.346256 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-scripts\") pod \"ceilometer-0\" (UID:
\"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.346407 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.346465 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-786k8\" (UniqueName: \"kubernetes.io/projected/a57d585c-03c6-4731-8271-2398c0d774ad-kube-api-access-786k8\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.346582 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-config-data\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.346662 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.346713 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-log-httpd\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.346764 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-run-httpd\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.347275 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-run-httpd\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.341720 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.348889 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-logs" (OuterVolumeSpecName: "logs") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.353484 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-log-httpd\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.363966 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-scripts" (OuterVolumeSpecName: "scripts") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.364122 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58f4f2ad-1b9a-4be4-a535-32183ce254a6-kube-api-access-8j8gd" (OuterVolumeSpecName: "kube-api-access-8j8gd") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "kube-api-access-8j8gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.364329 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-scripts\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.368977 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.372973 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.374297 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.383472 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786k8\" (UniqueName: \"kubernetes.io/projected/a57d585c-03c6-4731-8271-2398c0d774ad-kube-api-access-786k8\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.392321 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-config-data\") pod \"ceilometer-0\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.423438 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.430725 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-config-data" (OuterVolumeSpecName: "config-data") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.452321 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.452351 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.452361 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j8gd\" (UniqueName: \"kubernetes.io/projected/58f4f2ad-1b9a-4be4-a535-32183ce254a6-kube-api-access-8j8gd\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.452383 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.452393 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58f4f2ad-1b9a-4be4-a535-32183ce254a6-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.452401 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-scripts\") 
on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.452409 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.456000 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.474640 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58f4f2ad-1b9a-4be4-a535-32183ce254a6" (UID: "58f4f2ad-1b9a-4be4-a535-32183ce254a6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.493129 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.554806 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.555066 4626 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58f4f2ad-1b9a-4be4-a535-32183ce254a6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.706703 4626 generic.go:334] "Generic (PLEG): container finished" podID="8624d986-dff6-40bd-937d-755c2ca809d9" containerID="a5a014efa76b7de50de6290e6814485f21418749702dd366d652bb5a811d9fcf" exitCode=137 Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.706812 4626 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c5c7b856-snkxr" event={"ID":"8624d986-dff6-40bd-937d-755c2ca809d9","Type":"ContainerDied","Data":"a5a014efa76b7de50de6290e6814485f21418749702dd366d652bb5a811d9fcf"} Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.709485 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.709490 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58f4f2ad-1b9a-4be4-a535-32183ce254a6","Type":"ContainerDied","Data":"8fecba7cfe5a4a4fe9099363fbd84a5c4ec664248fe5fc372a89f926d575fc94"} Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.709719 4626 scope.go:117] "RemoveContainer" containerID="d8c9929a3dcc07967d2a2cb42b1477ead3837767b756bbeab0bfbc273efc72a5" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.785317 4626 scope.go:117] "RemoveContainer" containerID="13b00ec4e33abde6af8fa25378a16255506bb30adff2ddb2d6b3866f55040d08" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.794826 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.838993 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.839053 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:00:48 crc kubenswrapper[4626]: E0223 07:00:48.839542 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-httpd" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.839558 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-httpd" Feb 23 07:00:48 
crc kubenswrapper[4626]: E0223 07:00:48.839575 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-log" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.839582 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-log" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.844070 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-log" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.844129 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" containerName="glance-httpd" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.847641 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.847770 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.850001 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.850283 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.851304 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:00:48 crc kubenswrapper[4626]: W0223 07:00:48.890792 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5f9731f_2161_4757_97a7_e542f744362c.slice/crio-01f6f04edb1249278f7b5bacfde6274f6833593be632b671d45a1936cd6f63df WatchSource:0}: Error finding container 01f6f04edb1249278f7b5bacfde6274f6833593be632b671d45a1936cd6f63df: Status 404 returned error can't find the container with id 01f6f04edb1249278f7b5bacfde6274f6833593be632b671d45a1936cd6f63df Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.946363 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.964032 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.976205 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.976286 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7afc12b1-f684-47fc-bb2f-201f09707ad6-logs\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.976610 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.976750 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7afc12b1-f684-47fc-bb2f-201f09707ad6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.977005 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.977040 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.977069 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg27r\" (UniqueName: \"kubernetes.io/projected/7afc12b1-f684-47fc-bb2f-201f09707ad6-kube-api-access-gg27r\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:48 crc kubenswrapper[4626]: I0223 07:00:48.977171 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.080213 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-config-data\") pod \"8624d986-dff6-40bd-937d-755c2ca809d9\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.080526 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt9ql\" (UniqueName: \"kubernetes.io/projected/8624d986-dff6-40bd-937d-755c2ca809d9-kube-api-access-lt9ql\") pod \"8624d986-dff6-40bd-937d-755c2ca809d9\" (UID: 
\"8624d986-dff6-40bd-937d-755c2ca809d9\") " Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.080592 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-scripts\") pod \"8624d986-dff6-40bd-937d-755c2ca809d9\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.080633 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-secret-key\") pod \"8624d986-dff6-40bd-937d-755c2ca809d9\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.080672 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-tls-certs\") pod \"8624d986-dff6-40bd-937d-755c2ca809d9\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.081429 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-combined-ca-bundle\") pod \"8624d986-dff6-40bd-937d-755c2ca809d9\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.082084 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8624d986-dff6-40bd-937d-755c2ca809d9-logs\") pod \"8624d986-dff6-40bd-937d-755c2ca809d9\" (UID: \"8624d986-dff6-40bd-937d-755c2ca809d9\") " Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.083205 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8624d986-dff6-40bd-937d-755c2ca809d9-logs" (OuterVolumeSpecName: "logs") pod "8624d986-dff6-40bd-937d-755c2ca809d9" (UID: "8624d986-dff6-40bd-937d-755c2ca809d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.099264 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7afc12b1-f684-47fc-bb2f-201f09707ad6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.098586 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7afc12b1-f684-47fc-bb2f-201f09707ad6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.106720 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.106779 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.106813 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg27r\" (UniqueName: 
\"kubernetes.io/projected/7afc12b1-f684-47fc-bb2f-201f09707ad6-kube-api-access-gg27r\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.106932 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.107079 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.107163 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7afc12b1-f684-47fc-bb2f-201f09707ad6-logs\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.107203 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.107375 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8624d986-dff6-40bd-937d-755c2ca809d9-logs\") on node \"crc\" DevicePath \"\"" Feb 23 
07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.108287 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.108751 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8624d986-dff6-40bd-937d-755c2ca809d9-kube-api-access-lt9ql" (OuterVolumeSpecName: "kube-api-access-lt9ql") pod "8624d986-dff6-40bd-937d-755c2ca809d9" (UID: "8624d986-dff6-40bd-937d-755c2ca809d9"). InnerVolumeSpecName "kube-api-access-lt9ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.109439 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7afc12b1-f684-47fc-bb2f-201f09707ad6-logs\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.115236 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.115799 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc 
kubenswrapper[4626]: I0223 07:00:49.121789 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-config-data" (OuterVolumeSpecName: "config-data") pod "8624d986-dff6-40bd-937d-755c2ca809d9" (UID: "8624d986-dff6-40bd-937d-755c2ca809d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.125322 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg27r\" (UniqueName: \"kubernetes.io/projected/7afc12b1-f684-47fc-bb2f-201f09707ad6-kube-api-access-gg27r\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.128446 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-scripts" (OuterVolumeSpecName: "scripts") pod "8624d986-dff6-40bd-937d-755c2ca809d9" (UID: "8624d986-dff6-40bd-937d-755c2ca809d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.160043 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8624d986-dff6-40bd-937d-755c2ca809d9" (UID: "8624d986-dff6-40bd-937d-755c2ca809d9"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.161034 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.162468 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7afc12b1-f684-47fc-bb2f-201f09707ad6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.184612 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8624d986-dff6-40bd-937d-755c2ca809d9" (UID: "8624d986-dff6-40bd-937d-755c2ca809d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.188691 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"7afc12b1-f684-47fc-bb2f-201f09707ad6\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.191714 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.208875 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.208902 4626 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.208914 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.208922 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8624d986-dff6-40bd-937d-755c2ca809d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.208932 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt9ql\" (UniqueName: \"kubernetes.io/projected/8624d986-dff6-40bd-937d-755c2ca809d9-kube-api-access-lt9ql\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.215815 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "8624d986-dff6-40bd-937d-755c2ca809d9" (UID: "8624d986-dff6-40bd-937d-755c2ca809d9"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.310405 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.310656 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-log" containerID="cri-o://7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160" gracePeriod=30 Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.311099 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-httpd" containerID="cri-o://968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912" gracePeriod=30 Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.319099 4626 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8624d986-dff6-40bd-937d-755c2ca809d9-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.731795 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"70eb0f85-ccb5-4ba7-b3bd-f586483ca336","Type":"ContainerDied","Data":"7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160"} Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.731679 4626 generic.go:334] "Generic (PLEG): container finished" podID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerID="7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160" exitCode=143 Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.737488 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerStarted","Data":"5d24ab5edec714694b5fe2107c61a1029fe66f11e38cd4436db8e0eb2ef49cba"} Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.743056 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5f9731f-2161-4757-97a7-e542f744362c","Type":"ContainerStarted","Data":"4ecaeb29d55f25c1512ffad33b423aeb32233b6144a0888f7c9d87f58afeeb00"} Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.743101 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5f9731f-2161-4757-97a7-e542f744362c","Type":"ContainerStarted","Data":"01f6f04edb1249278f7b5bacfde6274f6833593be632b671d45a1936cd6f63df"} Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.759365 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c5c7b856-snkxr" event={"ID":"8624d986-dff6-40bd-937d-755c2ca809d9","Type":"ContainerDied","Data":"d1466fa67cdc3b13ffb527d31f4d6111aa7b04af96e278d8aaae713591e1e044"} Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.759461 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9c5c7b856-snkxr" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.759541 4626 scope.go:117] "RemoveContainer" containerID="60f2992022985555c9b19a0c7fb549a7d54427452a1fd539453a54ef31a10a6b" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.815307 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9c5c7b856-snkxr"] Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.831702 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9c5c7b856-snkxr"] Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.915483 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.981271 4626 scope.go:117] "RemoveContainer" containerID="a5a014efa76b7de50de6290e6814485f21418749702dd366d652bb5a811d9fcf" Feb 23 07:00:49 crc kubenswrapper[4626]: I0223 07:00:49.983125 4626 scope.go:117] "RemoveContainer" containerID="7166a173be963bb9807b17c0a71a6fd8548daf87a8e39e9046af32464079eb28" Feb 23 07:00:49 crc kubenswrapper[4626]: W0223 07:00:49.986229 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7afc12b1_f684_47fc_bb2f_201f09707ad6.slice/crio-6d07a41b3a7b79e9ab41f1c651c6c1dbb58fe8a69b139e5fbc535f3f8e349135 WatchSource:0}: Error finding container 6d07a41b3a7b79e9ab41f1c651c6c1dbb58fe8a69b139e5fbc535f3f8e349135: Status 404 returned error can't find the container with id 6d07a41b3a7b79e9ab41f1c651c6c1dbb58fe8a69b139e5fbc535f3f8e349135 Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.019620 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58f4f2ad-1b9a-4be4-a535-32183ce254a6" path="/var/lib/kubelet/pods/58f4f2ad-1b9a-4be4-a535-32183ce254a6/volumes" Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.020595 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" path="/var/lib/kubelet/pods/8624d986-dff6-40bd-937d-755c2ca809d9/volumes" Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.021187 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa38e624-6de3-4e41-841e-21109ef1e723" path="/var/lib/kubelet/pods/aa38e624-6de3-4e41-841e-21109ef1e723/volumes" Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.813966 4626 generic.go:334] "Generic (PLEG): container finished" podID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerID="2ae0c506b97fe511c8b662a87402ae3a36e67687f2e8f672b90b6cca99dc3cd3" exitCode=1 Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.814031 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65db596798-kdkj2" event={"ID":"1c0ed8c1-9098-4598-8390-ed8c709fa057","Type":"ContainerDied","Data":"2ae0c506b97fe511c8b662a87402ae3a36e67687f2e8f672b90b6cca99dc3cd3"} Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.814270 4626 scope.go:117] "RemoveContainer" containerID="7166a173be963bb9807b17c0a71a6fd8548daf87a8e39e9046af32464079eb28" Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.814698 4626 scope.go:117] "RemoveContainer" containerID="2ae0c506b97fe511c8b662a87402ae3a36e67687f2e8f672b90b6cca99dc3cd3" Feb 23 07:00:50 crc kubenswrapper[4626]: E0223 07:00:50.814904 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 20s restarting failed container=heat-cfnapi pod=heat-cfnapi-65db596798-kdkj2_openstack(1c0ed8c1-9098-4598-8390-ed8c709fa057)\"" pod="openstack/heat-cfnapi-65db596798-kdkj2" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.817556 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7afc12b1-f684-47fc-bb2f-201f09707ad6","Type":"ContainerStarted","Data":"6d07a41b3a7b79e9ab41f1c651c6c1dbb58fe8a69b139e5fbc535f3f8e349135"} Feb 23 07:00:50 crc kubenswrapper[4626]: I0223 07:00:50.820538 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerStarted","Data":"87a2d96e8889ccec5a47a2499e0061f8ca3ab378201ef69903e153954cd8033c"} Feb 23 07:00:51 crc kubenswrapper[4626]: I0223 07:00:51.865578 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerStarted","Data":"4df5fa9500589425421a100d8b1872a109d6923e6e636cd46219731c2af54a0b"} Feb 23 07:00:51 crc kubenswrapper[4626]: I0223 07:00:51.879055 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5f9731f-2161-4757-97a7-e542f744362c","Type":"ContainerStarted","Data":"19a1f27198f7b1bf80218af61fc31f9097aef98b15fc58b03acaf5e2fede8381"} Feb 23 07:00:51 crc kubenswrapper[4626]: I0223 07:00:51.879266 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 07:00:51 crc kubenswrapper[4626]: I0223 07:00:51.892538 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7afc12b1-f684-47fc-bb2f-201f09707ad6","Type":"ContainerStarted","Data":"4e0816bd8d97fe87997dae4c447511c2d4487c65072b8a4ed01a83529e9132f2"} Feb 23 07:00:51 crc kubenswrapper[4626]: I0223 07:00:51.892670 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7afc12b1-f684-47fc-bb2f-201f09707ad6","Type":"ContainerStarted","Data":"4edc9ced7cf43b062d14a01eb24ad99a105e81955bc5e95e7b7d06853abbaf26"} Feb 23 07:00:51 crc kubenswrapper[4626]: I0223 07:00:51.908244 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=4.908229961 podStartE2EDuration="4.908229961s" podCreationTimestamp="2026-02-23 07:00:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:51.89579939 +0000 UTC m=+1204.235128656" watchObservedRunningTime="2026-02-23 07:00:51.908229961 +0000 UTC m=+1204.247559227" Feb 23 07:00:51 crc kubenswrapper[4626]: I0223 07:00:51.932854 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.932840582 podStartE2EDuration="3.932840582s" podCreationTimestamp="2026-02-23 07:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:51.921564648 +0000 UTC m=+1204.260893914" watchObservedRunningTime="2026-02-23 07:00:51.932840582 +0000 UTC m=+1204.272169849" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.480716 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tpl9d"] Feb 23 07:00:52 crc kubenswrapper[4626]: E0223 07:00:52.481433 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon-log" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.481452 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon-log" Feb 23 07:00:52 crc kubenswrapper[4626]: E0223 07:00:52.481489 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.481510 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.481724 4626 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon-log" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.481746 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="8624d986-dff6-40bd-937d-755c2ca809d9" containerName="horizon" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.482413 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.497115 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tpl9d"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.615145 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-g4nht"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.616632 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.616838 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfn5t\" (UniqueName: \"kubernetes.io/projected/45aedbbc-efbd-4bf7-bbdb-f36992267beb-kube-api-access-mfn5t\") pod \"nova-api-db-create-tpl9d\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.617095 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aedbbc-efbd-4bf7-bbdb-f36992267beb-operator-scripts\") pod \"nova-api-db-create-tpl9d\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.647144 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8656-account-create-update-4cfz9"] 
Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.649396 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.655290 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.695463 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8656-account-create-update-4cfz9"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.718782 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-operator-scripts\") pod \"nova-cell0-db-create-g4nht\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.718939 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfn5t\" (UniqueName: \"kubernetes.io/projected/45aedbbc-efbd-4bf7-bbdb-f36992267beb-kube-api-access-mfn5t\") pod \"nova-api-db-create-tpl9d\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.719094 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aedbbc-efbd-4bf7-bbdb-f36992267beb-operator-scripts\") pod \"nova-api-db-create-tpl9d\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.719116 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lv7s\" (UniqueName: 
\"kubernetes.io/projected/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-kube-api-access-8lv7s\") pod \"nova-cell0-db-create-g4nht\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.720251 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aedbbc-efbd-4bf7-bbdb-f36992267beb-operator-scripts\") pod \"nova-api-db-create-tpl9d\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.744572 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-g4nht"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.759401 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-c8mts"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.767656 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.767990 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfn5t\" (UniqueName: \"kubernetes.io/projected/45aedbbc-efbd-4bf7-bbdb-f36992267beb-kube-api-access-mfn5t\") pod \"nova-api-db-create-tpl9d\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.797017 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-c8mts"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.801644 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.821191 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lv7s\" (UniqueName: \"kubernetes.io/projected/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-kube-api-access-8lv7s\") pod \"nova-cell0-db-create-g4nht\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.821240 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-operator-scripts\") pod \"nova-cell0-db-create-g4nht\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.821275 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-operator-scripts\") pod \"nova-api-8656-account-create-update-4cfz9\" (UID: \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") " pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.821295 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxt2n\" (UniqueName: \"kubernetes.io/projected/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-kube-api-access-bxt2n\") pod \"nova-api-8656-account-create-update-4cfz9\" (UID: \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") " pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.822155 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-operator-scripts\") pod 
\"nova-cell0-db-create-g4nht\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.843018 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9b56-account-create-update-gxhrg"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.844286 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.851390 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.852961 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lv7s\" (UniqueName: \"kubernetes.io/projected/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-kube-api-access-8lv7s\") pod \"nova-cell0-db-create-g4nht\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.917823 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9b56-account-create-update-gxhrg"] Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.923120 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng96q\" (UniqueName: \"kubernetes.io/projected/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-kube-api-access-ng96q\") pod \"nova-cell1-db-create-c8mts\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.923234 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-operator-scripts\") pod \"nova-api-8656-account-create-update-4cfz9\" (UID: 
\"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") " pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.923260 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxt2n\" (UniqueName: \"kubernetes.io/projected/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-kube-api-access-bxt2n\") pod \"nova-api-8656-account-create-update-4cfz9\" (UID: \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") " pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.923400 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-operator-scripts\") pod \"nova-cell1-db-create-c8mts\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.924214 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-operator-scripts\") pod \"nova-api-8656-account-create-update-4cfz9\" (UID: \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") " pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.932475 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.948451 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerStarted","Data":"ba7421a8b00f3e04a25123b3aa361763691859e236695fe28a75881e196e0b0d"} Feb 23 07:00:52 crc kubenswrapper[4626]: I0223 07:00:52.966989 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxt2n\" (UniqueName: \"kubernetes.io/projected/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-kube-api-access-bxt2n\") pod \"nova-api-8656-account-create-update-4cfz9\" (UID: \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") " pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.042678 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-operator-scripts\") pod \"nova-cell1-db-create-c8mts\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.042776 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9cjf\" (UniqueName: \"kubernetes.io/projected/31c479ac-ec69-4792-848a-20a6e6e92ee1-kube-api-access-r9cjf\") pod \"nova-cell0-9b56-account-create-update-gxhrg\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") " pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.042864 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng96q\" (UniqueName: \"kubernetes.io/projected/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-kube-api-access-ng96q\") pod \"nova-cell1-db-create-c8mts\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " 
pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.042907 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31c479ac-ec69-4792-848a-20a6e6e92ee1-operator-scripts\") pod \"nova-cell0-9b56-account-create-update-gxhrg\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") " pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.056958 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-operator-scripts\") pod \"nova-cell1-db-create-c8mts\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.093315 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng96q\" (UniqueName: \"kubernetes.io/projected/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-kube-api-access-ng96q\") pod \"nova-cell1-db-create-c8mts\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.128690 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.176843 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9cjf\" (UniqueName: \"kubernetes.io/projected/31c479ac-ec69-4792-848a-20a6e6e92ee1-kube-api-access-r9cjf\") pod \"nova-cell0-9b56-account-create-update-gxhrg\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") " pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.177102 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31c479ac-ec69-4792-848a-20a6e6e92ee1-operator-scripts\") pod \"nova-cell0-9b56-account-create-update-gxhrg\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") " pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.179354 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e978-account-create-update-5t98k"] Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.180860 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.182750 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31c479ac-ec69-4792-848a-20a6e6e92ee1-operator-scripts\") pod \"nova-cell0-9b56-account-create-update-gxhrg\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") " pod="openstack/nova-cell0-9b56-account-create-update-gxhrg"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.192433 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.193553 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e978-account-create-update-5t98k"]
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.212079 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9cjf\" (UniqueName: \"kubernetes.io/projected/31c479ac-ec69-4792-848a-20a6e6e92ee1-kube-api-access-r9cjf\") pod \"nova-cell0-9b56-account-create-update-gxhrg\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") " pod="openstack/nova-cell0-9b56-account-create-update-gxhrg"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.244123 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.264919 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8656-account-create-update-4cfz9"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.279793 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-operator-scripts\") pod \"nova-cell1-e978-account-create-update-5t98k\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.279958 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6lg\" (UniqueName: \"kubernetes.io/projected/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-kube-api-access-bc6lg\") pod \"nova-cell1-e978-account-create-update-5t98k\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.382885 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-operator-scripts\") pod \"nova-cell1-e978-account-create-update-5t98k\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.383038 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6lg\" (UniqueName: \"kubernetes.io/projected/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-kube-api-access-bc6lg\") pod \"nova-cell1-e978-account-create-update-5t98k\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.384395 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-operator-scripts\") pod \"nova-cell1-e978-account-create-update-5t98k\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.402218 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc6lg\" (UniqueName: \"kubernetes.io/projected/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-kube-api-access-bc6lg\") pod \"nova-cell1-e978-account-create-update-5t98k\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.483887 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tpl9d"]
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.520763 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.736446 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-g4nht"]
Feb 23 07:00:53 crc kubenswrapper[4626]: W0223 07:00:53.753375 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e37d982_ad74_4cac_bd65_fc40c41f4dc5.slice/crio-df0c3322efb1dba8173b57788d034d334b64e82c76bcd58116da9d00098aac5a WatchSource:0}: Error finding container df0c3322efb1dba8173b57788d034d334b64e82c76bcd58116da9d00098aac5a: Status 404 returned error can't find the container with id df0c3322efb1dba8173b57788d034d334b64e82c76bcd58116da9d00098aac5a
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.937108 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.967368 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-g4nht" event={"ID":"1e37d982-ad74-4cac-bd65-fc40c41f4dc5","Type":"ContainerStarted","Data":"df0c3322efb1dba8173b57788d034d334b64e82c76bcd58116da9d00098aac5a"}
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.983974 4626 generic.go:334] "Generic (PLEG): container finished" podID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerID="968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912" exitCode=0
Feb 23 07:00:53 crc kubenswrapper[4626]: I0223 07:00:53.984188 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.003339 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"70eb0f85-ccb5-4ba7-b3bd-f586483ca336","Type":"ContainerDied","Data":"968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912"}
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.003375 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"70eb0f85-ccb5-4ba7-b3bd-f586483ca336","Type":"ContainerDied","Data":"c5518656b5fe263d3d3fe173e4f6fdabfe62cfa97711a06da45ab6efeeab0898"}
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.003394 4626 scope.go:117] "RemoveContainer" containerID="968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.006474 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerStarted","Data":"4a72a2626560273a8c17060205a4b92de7f8e7aa4470ca12686eb935d791dc8c"}
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.007450
4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.012265 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tpl9d" event={"ID":"45aedbbc-efbd-4bf7-bbdb-f36992267beb","Type":"ContainerStarted","Data":"61dcad718ea2ed55774dbe7a900419ad235fe06de3c4d283f77fd07de564bc1c"}
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.029305 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.645070294 podStartE2EDuration="7.029293228s" podCreationTimestamp="2026-02-23 07:00:47 +0000 UTC" firstStartedPulling="2026-02-23 07:00:48.945384563 +0000 UTC m=+1201.284713829" lastFinishedPulling="2026-02-23 07:00:53.329607497 +0000 UTC m=+1205.668936763" observedRunningTime="2026-02-23 07:00:54.025823909 +0000 UTC m=+1206.365153175" watchObservedRunningTime="2026-02-23 07:00:54.029293228 +0000 UTC m=+1206.368622494"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.032781 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-httpd-run\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.033025 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.033150 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-scripts\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.033299 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-logs\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.033412 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxx84\" (UniqueName: \"kubernetes.io/projected/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-kube-api-access-kxx84\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.033527 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-config-data\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.033627 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-public-tls-certs\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.033749 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-combined-ca-bundle\") pod \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\" (UID: \"70eb0f85-ccb5-4ba7-b3bd-f586483ca336\") "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.049383 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.051055 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-logs" (OuterVolumeSpecName: "logs") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.056371 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.068402 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-kube-api-access-kxx84" (OuterVolumeSpecName: "kube-api-access-kxx84") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "kube-api-access-kxx84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.078125 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-scripts" (OuterVolumeSpecName: "scripts") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.091805 4626 scope.go:117] "RemoveContainer" containerID="7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.122287 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.137702 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxx84\" (UniqueName: \"kubernetes.io/projected/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-kube-api-access-kxx84\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.137734 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.137746 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.137774 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.137783 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.137793 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.157609 4626 scope.go:117] "RemoveContainer" containerID="968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912"
Feb 23 07:00:54 crc kubenswrapper[4626]: E0223 07:00:54.165639 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912\": container with ID starting with 968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912 not found: ID does not exist" containerID="968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.165683 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912"} err="failed to get container status \"968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912\": rpc error: code = NotFound desc = could not find container \"968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912\": container with ID starting with 968e6cd65317a53095acbef87069907fc4719107573c0eb974222896e5218912 not found: ID does not exist"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.165708 4626 scope.go:117] "RemoveContainer" containerID="7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160"
Feb 23 07:00:54 crc kubenswrapper[4626]: E0223 07:00:54.167193 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160\": container with ID starting with 7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160 not found: ID does not exist" containerID="7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.167315 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160"} err="failed to get container status \"7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160\": rpc error: code = NotFound desc = could not find container \"7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160\": container with ID starting with 7c60556e5fc3ae0b41ae50c38282ebe8af697a07bf1c7dc3f84d21697912b160 not found: ID does not exist"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.172266 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-c8mts"]
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.183050 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.212060 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.230592 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-config-data" (OuterVolumeSpecName: "config-data") pod "70eb0f85-ccb5-4ba7-b3bd-f586483ca336" (UID: "70eb0f85-ccb5-4ba7-b3bd-f586483ca336"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.240123 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.240149 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.240159 4626 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70eb0f85-ccb5-4ba7-b3bd-f586483ca336-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.350360 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9b56-account-create-update-gxhrg"]
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.415989 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8656-account-create-update-4cfz9"]
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.444106 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.462699 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.484740 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 07:00:54 crc kubenswrapper[4626]: E0223 07:00:54.485289 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-httpd"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.485309 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-httpd"
Feb 23 07:00:54 crc kubenswrapper[4626]: E0223 07:00:54.485325 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-log"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.485332 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-log"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.488151 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-log"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.488199 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" containerName="glance-httpd"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.492772 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.497707 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.498694 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.498875 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.509402 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e978-account-create-update-5t98k"]
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553611 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName:
\"kubernetes.io/empty-dir/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553668 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7x8n\" (UniqueName: \"kubernetes.io/projected/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-kube-api-access-h7x8n\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553731 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-logs\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553858 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553882 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553902 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553921 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.553989 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655154 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655196 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655217 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655234 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655279 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655326 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655349 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7x8n\" (UniqueName: \"kubernetes.io/projected/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-kube-api-access-h7x8n\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655378 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-logs\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.655784 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-logs\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.656581 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.657364 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.661480 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.673186 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") "
pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.674055 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.678472 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.693187 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7x8n\" (UniqueName: \"kubernetes.io/projected/e2fc61e5-419c-4dab-9ddb-52bb9de855d5-kube-api-access-h7x8n\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.706317 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e2fc61e5-419c-4dab-9ddb-52bb9de855d5\") " pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.717279 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-65db596798-kdkj2"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.717592 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-65db596798-kdkj2"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.718255 4626 scope.go:117] "RemoveContainer" containerID="2ae0c506b97fe511c8b662a87402ae3a36e67687f2e8f672b90b6cca99dc3cd3"
Feb 23 07:00:54 crc kubenswrapper[4626]: E0223 07:00:54.718679 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 20s restarting failed container=heat-cfnapi pod=heat-cfnapi-65db596798-kdkj2_openstack(1c0ed8c1-9098-4598-8390-ed8c709fa057)\"" pod="openstack/heat-cfnapi-65db596798-kdkj2" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.840525 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 07:00:54 crc kubenswrapper[4626]: I0223 07:00:54.983311 4626 scope.go:117] "RemoveContainer" containerID="8f2821699d14eb43e02ba8822dad12c1d52ba059ce33cbad7c6affe4981980c9"
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.031690 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5d7b45f997-g8dhd"
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.057990 4626 generic.go:334] "Generic (PLEG): container finished" podID="3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5" containerID="95eb52a0e57988e2c3636f5e12d7ec782f7c1266674e9d9dbeaabcdf5cfda961" exitCode=0
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.058157 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c8mts" event={"ID":"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5","Type":"ContainerDied","Data":"95eb52a0e57988e2c3636f5e12d7ec782f7c1266674e9d9dbeaabcdf5cfda961"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.058255 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c8mts" event={"ID":"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5","Type":"ContainerStarted","Data":"37df140e8c5cb5c50c44a5f37b60c6d8179daec53173535e485e9ab015e96e63"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.069527 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e978-account-create-update-5t98k" event={"ID":"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c","Type":"ContainerStarted","Data":"c457b95e7a4ecfa6eb5de8a1deb3c5e649f263f1853b47d66f706143fa3becd4"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.069574 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e978-account-create-update-5t98k" event={"ID":"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c","Type":"ContainerStarted","Data":"a1017b4b6b7e5c24963be098658a9e7117a576bc5be6656e9c9023fb4b329c78"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.115726 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" event={"ID":"31c479ac-ec69-4792-848a-20a6e6e92ee1","Type":"ContainerStarted","Data":"6df77acb04c766991ba70857651cdb2310fcb94c2e083fd9ac18e6d978a291f9"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.115777 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" event={"ID":"31c479ac-ec69-4792-848a-20a6e6e92ee1","Type":"ContainerStarted","Data":"af9546450d864c919a88154c80ff6d73de826ef55287d04818c000cfafca1373"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.128299 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-655f59485b-t5d4q"]
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.128547 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-655f59485b-t5d4q" podUID="774d5101-24e2-4871-9a1a-f136698cf092" containerName="heat-engine" containerID="cri-o://04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" gracePeriod=60
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.141315 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8656-account-create-update-4cfz9" event={"ID":"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86","Type":"ContainerStarted","Data":"249c7803947ff2d7ef7677d7765267cd90c6832e69f6ae1cb298e0c245908d93"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.141349 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8656-account-create-update-4cfz9" event={"ID":"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86","Type":"ContainerStarted","Data":"c4958850fab7b0d83a03312d447d9321d486218ea5d7fd5ed7b5595845d3dd2a"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.152511 4626 generic.go:334] "Generic (PLEG): container finished" podID="45aedbbc-efbd-4bf7-bbdb-f36992267beb" containerID="d1d0db67d8f6d1719fb0fed314f5847de97023960d6f432e4a6272a4ae8aa88d" exitCode=0
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.152756 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tpl9d" event={"ID":"45aedbbc-efbd-4bf7-bbdb-f36992267beb","Type":"ContainerDied","Data":"d1d0db67d8f6d1719fb0fed314f5847de97023960d6f432e4a6272a4ae8aa88d"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.165044 4626 generic.go:334] "Generic (PLEG): container finished" podID="1e37d982-ad74-4cac-bd65-fc40c41f4dc5" containerID="3753ad5d94d4b119cfca356771f9b02e62b79b409ad5d5edc17146e84a4ed967" exitCode=0
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.165703 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-g4nht" event={"ID":"1e37d982-ad74-4cac-bd65-fc40c41f4dc5","Type":"ContainerDied","Data":"3753ad5d94d4b119cfca356771f9b02e62b79b409ad5d5edc17146e84a4ed967"}
Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.165972 4626 scope.go:117] "RemoveContainer" containerID="2ae0c506b97fe511c8b662a87402ae3a36e67687f2e8f672b90b6cca99dc3cd3"
Feb 23 07:00:55 crc kubenswrapper[4626]: E0223 07:00:55.166298 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 20s
restarting failed container=heat-cfnapi pod=heat-cfnapi-65db596798-kdkj2_openstack(1c0ed8c1-9098-4598-8390-ed8c709fa057)\"" pod="openstack/heat-cfnapi-65db596798-kdkj2" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.172188 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-e978-account-create-update-5t98k" podStartSLOduration=2.172175573 podStartE2EDuration="2.172175573s" podCreationTimestamp="2026-02-23 07:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:55.126249112 +0000 UTC m=+1207.465578377" watchObservedRunningTime="2026-02-23 07:00:55.172175573 +0000 UTC m=+1207.511504839" Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.193464 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" podStartSLOduration=3.193440888 podStartE2EDuration="3.193440888s" podCreationTimestamp="2026-02-23 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:55.152770725 +0000 UTC m=+1207.492099991" watchObservedRunningTime="2026-02-23 07:00:55.193440888 +0000 UTC m=+1207.532770153" Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.236200 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8656-account-create-update-4cfz9" podStartSLOduration=3.236177135 podStartE2EDuration="3.236177135s" podCreationTimestamp="2026-02-23 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:55.201735974 +0000 UTC m=+1207.541065240" watchObservedRunningTime="2026-02-23 07:00:55.236177135 +0000 UTC m=+1207.575506401" Feb 23 07:00:55 crc 
kubenswrapper[4626]: I0223 07:00:55.685907 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.686344 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.686522 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.687551 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5470378b5d8c9e6d24ed5c140362129ba764fb086113e814588933f56ca4c24"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.687664 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://a5470378b5d8c9e6d24ed5c140362129ba764fb086113e814588933f56ca4c24" gracePeriod=600 Feb 23 07:00:55 crc kubenswrapper[4626]: I0223 07:00:55.734419 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.012583 4626 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70eb0f85-ccb5-4ba7-b3bd-f586483ca336" path="/var/lib/kubelet/pods/70eb0f85-ccb5-4ba7-b3bd-f586483ca336/volumes" Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.202153 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2fc61e5-419c-4dab-9ddb-52bb9de855d5","Type":"ContainerStarted","Data":"f684b4918a5f09a67cb1b11bb66b0ea553fc2e77e0981e6c4b47a96e1cbfe24e"} Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.210827 4626 generic.go:334] "Generic (PLEG): container finished" podID="1dc4adbd-6bd0-4092-9a3f-59d17a09cb86" containerID="249c7803947ff2d7ef7677d7765267cd90c6832e69f6ae1cb298e0c245908d93" exitCode=0 Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.210881 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8656-account-create-update-4cfz9" event={"ID":"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86","Type":"ContainerDied","Data":"249c7803947ff2d7ef7677d7765267cd90c6832e69f6ae1cb298e0c245908d93"} Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.214084 4626 generic.go:334] "Generic (PLEG): container finished" podID="571961b2-ee38-4d10-a958-4157c3624ec2" containerID="35fa4a011f3cab1a6817a10b32cfc7e927de30733ca30319a4b41c7dead7b6b0" exitCode=1 Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.214128 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-675458dc64-5m4q9" event={"ID":"571961b2-ee38-4d10-a958-4157c3624ec2","Type":"ContainerDied","Data":"35fa4a011f3cab1a6817a10b32cfc7e927de30733ca30319a4b41c7dead7b6b0"} Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.214156 4626 scope.go:117] "RemoveContainer" containerID="8f2821699d14eb43e02ba8822dad12c1d52ba059ce33cbad7c6affe4981980c9" Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.215025 4626 scope.go:117] "RemoveContainer" 
containerID="35fa4a011f3cab1a6817a10b32cfc7e927de30733ca30319a4b41c7dead7b6b0" Feb 23 07:00:56 crc kubenswrapper[4626]: E0223 07:00:56.215340 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 20s restarting failed container=heat-api pod=heat-api-675458dc64-5m4q9_openstack(571961b2-ee38-4d10-a958-4157c3624ec2)\"" pod="openstack/heat-api-675458dc64-5m4q9" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.255618 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="a5470378b5d8c9e6d24ed5c140362129ba764fb086113e814588933f56ca4c24" exitCode=0 Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.255672 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"a5470378b5d8c9e6d24ed5c140362129ba764fb086113e814588933f56ca4c24"} Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.280768 4626 generic.go:334] "Generic (PLEG): container finished" podID="d5dc99b5-2935-4a73-aad4-ac3f687e6c9c" containerID="c457b95e7a4ecfa6eb5de8a1deb3c5e649f263f1853b47d66f706143fa3becd4" exitCode=0 Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.280943 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e978-account-create-update-5t98k" event={"ID":"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c","Type":"ContainerDied","Data":"c457b95e7a4ecfa6eb5de8a1deb3c5e649f263f1853b47d66f706143fa3becd4"} Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.283905 4626 generic.go:334] "Generic (PLEG): container finished" podID="31c479ac-ec69-4792-848a-20a6e6e92ee1" containerID="6df77acb04c766991ba70857651cdb2310fcb94c2e083fd9ac18e6d978a291f9" exitCode=0 Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.284009 4626 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" event={"ID":"31c479ac-ec69-4792-848a-20a6e6e92ee1","Type":"ContainerDied","Data":"6df77acb04c766991ba70857651cdb2310fcb94c2e083fd9ac18e6d978a291f9"} Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.445796 4626 scope.go:117] "RemoveContainer" containerID="3524b229b91a471ae481a5ae0ae84698672cefa6dfcb3ba75eb9b84f9fff35b3" Feb 23 07:00:56 crc kubenswrapper[4626]: I0223 07:00:56.957901 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.069929 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng96q\" (UniqueName: \"kubernetes.io/projected/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-kube-api-access-ng96q\") pod \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.071313 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-operator-scripts\") pod \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\" (UID: \"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5\") " Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.072393 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5" (UID: "3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.074415 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.077685 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-kube-api-access-ng96q" (OuterVolumeSpecName: "kube-api-access-ng96q") pod "3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5" (UID: "3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5"). InnerVolumeSpecName "kube-api-access-ng96q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.081723 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.092126 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.175557 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aedbbc-efbd-4bf7-bbdb-f36992267beb-operator-scripts\") pod \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.175681 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfn5t\" (UniqueName: \"kubernetes.io/projected/45aedbbc-efbd-4bf7-bbdb-f36992267beb-kube-api-access-mfn5t\") pod \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\" (UID: \"45aedbbc-efbd-4bf7-bbdb-f36992267beb\") " Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.175707 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-operator-scripts\") pod \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.175974 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lv7s\" (UniqueName: \"kubernetes.io/projected/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-kube-api-access-8lv7s\") pod \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\" (UID: \"1e37d982-ad74-4cac-bd65-fc40c41f4dc5\") " Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.176708 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng96q\" (UniqueName: \"kubernetes.io/projected/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5-kube-api-access-ng96q\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.178902 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e37d982-ad74-4cac-bd65-fc40c41f4dc5" (UID: "1e37d982-ad74-4cac-bd65-fc40c41f4dc5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.178932 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45aedbbc-efbd-4bf7-bbdb-f36992267beb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45aedbbc-efbd-4bf7-bbdb-f36992267beb" (UID: "45aedbbc-efbd-4bf7-bbdb-f36992267beb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.180470 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45aedbbc-efbd-4bf7-bbdb-f36992267beb-kube-api-access-mfn5t" (OuterVolumeSpecName: "kube-api-access-mfn5t") pod "45aedbbc-efbd-4bf7-bbdb-f36992267beb" (UID: "45aedbbc-efbd-4bf7-bbdb-f36992267beb"). InnerVolumeSpecName "kube-api-access-mfn5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.182468 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-kube-api-access-8lv7s" (OuterVolumeSpecName: "kube-api-access-8lv7s") pod "1e37d982-ad74-4cac-bd65-fc40c41f4dc5" (UID: "1e37d982-ad74-4cac-bd65-fc40c41f4dc5"). InnerVolumeSpecName "kube-api-access-8lv7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.233215 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-54768cd758-d6bbc" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.279143 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfn5t\" (UniqueName: \"kubernetes.io/projected/45aedbbc-efbd-4bf7-bbdb-f36992267beb-kube-api-access-mfn5t\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.279170 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.279182 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lv7s\" (UniqueName: \"kubernetes.io/projected/1e37d982-ad74-4cac-bd65-fc40c41f4dc5-kube-api-access-8lv7s\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.279205 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aedbbc-efbd-4bf7-bbdb-f36992267beb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.323055 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-675458dc64-5m4q9"] Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.356865 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-c8mts" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.360661 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-c8mts" event={"ID":"3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5","Type":"ContainerDied","Data":"37df140e8c5cb5c50c44a5f37b60c6d8179daec53173535e485e9ab015e96e63"} Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.360727 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37df140e8c5cb5c50c44a5f37b60c6d8179daec53173535e485e9ab015e96e63" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.379705 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"3ee7851d94adc9620d81b6807bd44d726c23bf9b55ab16cf5c2f4335cb177b50"} Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.424855 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2fc61e5-419c-4dab-9ddb-52bb9de855d5","Type":"ContainerStarted","Data":"3280db7927773b066361f184bf782cfed74a99d3ad41c644d7b893bf0e1ee6a0"} Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.442230 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tpl9d" event={"ID":"45aedbbc-efbd-4bf7-bbdb-f36992267beb","Type":"ContainerDied","Data":"61dcad718ea2ed55774dbe7a900419ad235fe06de3c4d283f77fd07de564bc1c"} Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.442280 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61dcad718ea2ed55774dbe7a900419ad235fe06de3c4d283f77fd07de564bc1c" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.442352 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tpl9d" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.461899 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-g4nht" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.464171 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-g4nht" event={"ID":"1e37d982-ad74-4cac-bd65-fc40c41f4dc5","Type":"ContainerDied","Data":"df0c3322efb1dba8173b57788d034d334b64e82c76bcd58116da9d00098aac5a"} Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.464214 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0c3322efb1dba8173b57788d034d334b64e82c76bcd58116da9d00098aac5a" Feb 23 07:00:57 crc kubenswrapper[4626]: E0223 07:00:57.570450 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 07:00:57 crc kubenswrapper[4626]: E0223 07:00:57.589614 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 07:00:57 crc kubenswrapper[4626]: E0223 07:00:57.614245 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 23 07:00:57 crc kubenswrapper[4626]: E0223 
07:00:57.614306 4626 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-655f59485b-t5d4q" podUID="774d5101-24e2-4871-9a1a-f136698cf092" containerName="heat-engine" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.750485 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7f6ddcf74f-chh24" Feb 23 07:00:57 crc kubenswrapper[4626]: I0223 07:00:57.828693 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-65db596798-kdkj2"] Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.059616 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-675458dc64-5m4q9" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.161089 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data-custom\") pod \"571961b2-ee38-4d10-a958-4157c3624ec2\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.161477 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data\") pod \"571961b2-ee38-4d10-a958-4157c3624ec2\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.161646 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xw4k\" (UniqueName: \"kubernetes.io/projected/571961b2-ee38-4d10-a958-4157c3624ec2-kube-api-access-9xw4k\") pod \"571961b2-ee38-4d10-a958-4157c3624ec2\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.161732 
4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-combined-ca-bundle\") pod \"571961b2-ee38-4d10-a958-4157c3624ec2\" (UID: \"571961b2-ee38-4d10-a958-4157c3624ec2\") " Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.192201 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "571961b2-ee38-4d10-a958-4157c3624ec2" (UID: "571961b2-ee38-4d10-a958-4157c3624ec2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.256683 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571961b2-ee38-4d10-a958-4157c3624ec2-kube-api-access-9xw4k" (OuterVolumeSpecName: "kube-api-access-9xw4k") pod "571961b2-ee38-4d10-a958-4157c3624ec2" (UID: "571961b2-ee38-4d10-a958-4157c3624ec2"). InnerVolumeSpecName "kube-api-access-9xw4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.264099 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e978-account-create-update-5t98k" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.271527 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.271617 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xw4k\" (UniqueName: \"kubernetes.io/projected/571961b2-ee38-4d10-a958-4157c3624ec2-kube-api-access-9xw4k\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.300792 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "571961b2-ee38-4d10-a958-4157c3624ec2" (UID: "571961b2-ee38-4d10-a958-4157c3624ec2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.373265 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc6lg\" (UniqueName: \"kubernetes.io/projected/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-kube-api-access-bc6lg\") pod \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.373491 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-operator-scripts\") pod \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\" (UID: \"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c\") " Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.374127 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5dc99b5-2935-4a73-aad4-ac3f687e6c9c" (UID: "d5dc99b5-2935-4a73-aad4-ac3f687e6c9c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.374626 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.374691 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.383234 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data" (OuterVolumeSpecName: "config-data") pod "571961b2-ee38-4d10-a958-4157c3624ec2" (UID: "571961b2-ee38-4d10-a958-4157c3624ec2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.394190 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8656-account-create-update-4cfz9" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.394832 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-kube-api-access-bc6lg" (OuterVolumeSpecName: "kube-api-access-bc6lg") pod "d5dc99b5-2935-4a73-aad4-ac3f687e6c9c" (UID: "d5dc99b5-2935-4a73-aad4-ac3f687e6c9c"). InnerVolumeSpecName "kube-api-access-bc6lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.424460 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.476550 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31c479ac-ec69-4792-848a-20a6e6e92ee1-operator-scripts\") pod \"31c479ac-ec69-4792-848a-20a6e6e92ee1\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.476680 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9cjf\" (UniqueName: \"kubernetes.io/projected/31c479ac-ec69-4792-848a-20a6e6e92ee1-kube-api-access-r9cjf\") pod \"31c479ac-ec69-4792-848a-20a6e6e92ee1\" (UID: \"31c479ac-ec69-4792-848a-20a6e6e92ee1\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.476733 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-operator-scripts\") pod \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\" (UID: \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.476988 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxt2n\" (UniqueName: \"kubernetes.io/projected/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-kube-api-access-bxt2n\") pod \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\" (UID: \"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.477778 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571961b2-ee38-4d10-a958-4157c3624ec2-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.477797 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc6lg\" (UniqueName: \"kubernetes.io/projected/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c-kube-api-access-bc6lg\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.479172 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31c479ac-ec69-4792-848a-20a6e6e92ee1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31c479ac-ec69-4792-848a-20a6e6e92ee1" (UID: "31c479ac-ec69-4792-848a-20a6e6e92ee1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.480589 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dc4adbd-6bd0-4092-9a3f-59d17a09cb86" (UID: "1dc4adbd-6bd0-4092-9a3f-59d17a09cb86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.487083 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c479ac-ec69-4792-848a-20a6e6e92ee1-kube-api-access-r9cjf" (OuterVolumeSpecName: "kube-api-access-r9cjf") pod "31c479ac-ec69-4792-848a-20a6e6e92ee1" (UID: "31c479ac-ec69-4792-848a-20a6e6e92ee1"). InnerVolumeSpecName "kube-api-access-r9cjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.487598 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65db596798-kdkj2"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.487945 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-kube-api-access-bxt2n" (OuterVolumeSpecName: "kube-api-access-bxt2n") pod "1dc4adbd-6bd0-4092-9a3f-59d17a09cb86" (UID: "1dc4adbd-6bd0-4092-9a3f-59d17a09cb86"). InnerVolumeSpecName "kube-api-access-bxt2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.502166 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8656-account-create-update-4cfz9" event={"ID":"1dc4adbd-6bd0-4092-9a3f-59d17a09cb86","Type":"ContainerDied","Data":"c4958850fab7b0d83a03312d447d9321d486218ea5d7fd5ed7b5595845d3dd2a"}
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.502202 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4958850fab7b0d83a03312d447d9321d486218ea5d7fd5ed7b5595845d3dd2a"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.502278 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8656-account-create-update-4cfz9"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.513530 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65db596798-kdkj2" event={"ID":"1c0ed8c1-9098-4598-8390-ed8c709fa057","Type":"ContainerDied","Data":"ac74afb29eb7d4abd6c00e633ee31340c125187991368bf65861950ed9709b3a"}
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.513647 4626 scope.go:117] "RemoveContainer" containerID="2ae0c506b97fe511c8b662a87402ae3a36e67687f2e8f672b90b6cca99dc3cd3"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.513820 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65db596798-kdkj2"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.526819 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-675458dc64-5m4q9" event={"ID":"571961b2-ee38-4d10-a958-4157c3624ec2","Type":"ContainerDied","Data":"b7398a6e49aa4cf2a4181147d3522055a0deef3c5e0ab215fe39a3a99346c830"}
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.526896 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-675458dc64-5m4q9"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.545886 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e978-account-create-update-5t98k"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.545924 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e978-account-create-update-5t98k" event={"ID":"d5dc99b5-2935-4a73-aad4-ac3f687e6c9c","Type":"ContainerDied","Data":"a1017b4b6b7e5c24963be098658a9e7117a576bc5be6656e9c9023fb4b329c78"}
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.546716 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1017b4b6b7e5c24963be098658a9e7117a576bc5be6656e9c9023fb4b329c78"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.555010 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.556539 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9b56-account-create-update-gxhrg" event={"ID":"31c479ac-ec69-4792-848a-20a6e6e92ee1","Type":"ContainerDied","Data":"af9546450d864c919a88154c80ff6d73de826ef55287d04818c000cfafca1373"}
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.556590 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9546450d864c919a88154c80ff6d73de826ef55287d04818c000cfafca1373"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.564639 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2fc61e5-419c-4dab-9ddb-52bb9de855d5","Type":"ContainerStarted","Data":"782625c317044d990faffa950aa28d50cad449a00e07b432af6bde9b6b824658"}
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.565800 4626 scope.go:117] "RemoveContainer" containerID="35fa4a011f3cab1a6817a10b32cfc7e927de30733ca30319a4b41c7dead7b6b0"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.579321 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbjx8\" (UniqueName: \"kubernetes.io/projected/1c0ed8c1-9098-4598-8390-ed8c709fa057-kube-api-access-pbjx8\") pod \"1c0ed8c1-9098-4598-8390-ed8c709fa057\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.579408 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data\") pod \"1c0ed8c1-9098-4598-8390-ed8c709fa057\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.579469 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data-custom\") pod \"1c0ed8c1-9098-4598-8390-ed8c709fa057\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.579587 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-combined-ca-bundle\") pod \"1c0ed8c1-9098-4598-8390-ed8c709fa057\" (UID: \"1c0ed8c1-9098-4598-8390-ed8c709fa057\") "
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.580266 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.580287 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxt2n\" (UniqueName: \"kubernetes.io/projected/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86-kube-api-access-bxt2n\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.580297 4626 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31c479ac-ec69-4792-848a-20a6e6e92ee1-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.580307 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9cjf\" (UniqueName: \"kubernetes.io/projected/31c479ac-ec69-4792-848a-20a6e6e92ee1-kube-api-access-r9cjf\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.594557 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0ed8c1-9098-4598-8390-ed8c709fa057-kube-api-access-pbjx8" (OuterVolumeSpecName: "kube-api-access-pbjx8") pod "1c0ed8c1-9098-4598-8390-ed8c709fa057" (UID: "1c0ed8c1-9098-4598-8390-ed8c709fa057"). InnerVolumeSpecName "kube-api-access-pbjx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.598940 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c0ed8c1-9098-4598-8390-ed8c709fa057" (UID: "1c0ed8c1-9098-4598-8390-ed8c709fa057"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.610755 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.610737825 podStartE2EDuration="4.610737825s" podCreationTimestamp="2026-02-23 07:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:00:58.591778026 +0000 UTC m=+1210.931107292" watchObservedRunningTime="2026-02-23 07:00:58.610737825 +0000 UTC m=+1210.950067090"
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.628989 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-675458dc64-5m4q9"]
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.638302 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-675458dc64-5m4q9"]
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.686232 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbjx8\" (UniqueName: \"kubernetes.io/projected/1c0ed8c1-9098-4598-8390-ed8c709fa057-kube-api-access-pbjx8\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.686609 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.688605 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data" (OuterVolumeSpecName: "config-data") pod "1c0ed8c1-9098-4598-8390-ed8c709fa057" (UID: "1c0ed8c1-9098-4598-8390-ed8c709fa057"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.712068 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c0ed8c1-9098-4598-8390-ed8c709fa057" (UID: "1c0ed8c1-9098-4598-8390-ed8c709fa057"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.795123 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.795154 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0ed8c1-9098-4598-8390-ed8c709fa057-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.917047 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-65db596798-kdkj2"]
Feb 23 07:00:58 crc kubenswrapper[4626]: I0223 07:00:58.924745 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-65db596798-kdkj2"]
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.192789 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.194149 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.230348 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.238958 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.580175 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.580806 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.971693 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.971991 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-central-agent" containerID="cri-o://87a2d96e8889ccec5a47a2499e0061f8ca3ab378201ef69903e153954cd8033c" gracePeriod=30
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.972064 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="proxy-httpd" containerID="cri-o://4a72a2626560273a8c17060205a4b92de7f8e7aa4470ca12686eb935d791dc8c" gracePeriod=30
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.972077 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="sg-core" containerID="cri-o://ba7421a8b00f3e04a25123b3aa361763691859e236695fe28a75881e196e0b0d" gracePeriod=30
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.972273 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-notification-agent" containerID="cri-o://4df5fa9500589425421a100d8b1872a109d6923e6e636cd46219731c2af54a0b" gracePeriod=30
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.992716 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" path="/var/lib/kubelet/pods/1c0ed8c1-9098-4598-8390-ed8c709fa057/volumes"
Feb 23 07:00:59 crc kubenswrapper[4626]: I0223 07:00:59.999667 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" path="/var/lib/kubelet/pods/571961b2-ee38-4d10-a958-4157c3624ec2/volumes"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.134857 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29530501-tbhcl"]
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139539 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e37d982-ad74-4cac-bd65-fc40c41f4dc5" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139568 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e37d982-ad74-4cac-bd65-fc40c41f4dc5" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139591 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139596 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139606 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139611 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139636 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139641 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139650 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c479ac-ec69-4792-848a-20a6e6e92ee1" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139655 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c479ac-ec69-4792-848a-20a6e6e92ee1" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139662 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dc99b5-2935-4a73-aad4-ac3f687e6c9c" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139667 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dc99b5-2935-4a73-aad4-ac3f687e6c9c" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139679 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139683 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139697 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139702 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139710 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc4adbd-6bd0-4092-9a3f-59d17a09cb86" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139714 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc4adbd-6bd0-4092-9a3f-59d17a09cb86" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139724 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139728 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139734 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139741 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: E0223 07:01:00.139752 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45aedbbc-efbd-4bf7-bbdb-f36992267beb" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139757 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="45aedbbc-efbd-4bf7-bbdb-f36992267beb" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.139993 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140002 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140009 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140015 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140034 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="45aedbbc-efbd-4bf7-bbdb-f36992267beb" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140043 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140052 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dc99b5-2935-4a73-aad4-ac3f687e6c9c" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140057 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc4adbd-6bd0-4092-9a3f-59d17a09cb86" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140066 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c479ac-ec69-4792-848a-20a6e6e92ee1" containerName="mariadb-account-create-update"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140077 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e37d982-ad74-4cac-bd65-fc40c41f4dc5" containerName="mariadb-database-create"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.140727 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.154429 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530501-tbhcl"]
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.278268 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-combined-ca-bundle\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.278562 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-fernet-keys\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.278677 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5pd\" (UniqueName: \"kubernetes.io/projected/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-kube-api-access-cd5pd\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.278724 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-config-data\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.381328 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-combined-ca-bundle\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.381634 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-fernet-keys\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.383256 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5pd\" (UniqueName: \"kubernetes.io/projected/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-kube-api-access-cd5pd\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.383404 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-config-data\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.391395 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-fernet-keys\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.392778 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-config-data\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.403998 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-combined-ca-bundle\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.424726 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5pd\" (UniqueName: \"kubernetes.io/projected/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-kube-api-access-cd5pd\") pod \"keystone-cron-29530501-tbhcl\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.474361 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530501-tbhcl"
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.618936 4626 generic.go:334] "Generic (PLEG): container finished" podID="a57d585c-03c6-4731-8271-2398c0d774ad" containerID="4a72a2626560273a8c17060205a4b92de7f8e7aa4470ca12686eb935d791dc8c" exitCode=0
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.619018 4626 generic.go:334] "Generic (PLEG): container finished" podID="a57d585c-03c6-4731-8271-2398c0d774ad" containerID="ba7421a8b00f3e04a25123b3aa361763691859e236695fe28a75881e196e0b0d" exitCode=2
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.619948 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerDied","Data":"4a72a2626560273a8c17060205a4b92de7f8e7aa4470ca12686eb935d791dc8c"}
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.620024 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerDied","Data":"ba7421a8b00f3e04a25123b3aa361763691859e236695fe28a75881e196e0b0d"}
Feb 23 07:01:00 crc kubenswrapper[4626]: I0223 07:01:00.902640 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530501-tbhcl"]
Feb 23 07:01:01 crc kubenswrapper[4626]: I0223 07:01:01.630283 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530501-tbhcl" event={"ID":"4b61fdde-2209-4bd7-b3c1-a8f4123825a1","Type":"ContainerStarted","Data":"f1368d5f236351434fe1b4d9fed5bab00e947c72bbfef9ffaf544ff74addd0eb"}
Feb 23 07:01:01 crc kubenswrapper[4626]: I0223 07:01:01.630813 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530501-tbhcl" event={"ID":"4b61fdde-2209-4bd7-b3c1-a8f4123825a1","Type":"ContainerStarted","Data":"09d7ab9cce8bf206f397dbcbd1ab0d4869f3e431651e5681ccc23346e3739a16"}
Feb 23 07:01:01 crc kubenswrapper[4626]: I0223 07:01:01.637780 4626 generic.go:334] "Generic (PLEG): container finished" podID="a57d585c-03c6-4731-8271-2398c0d774ad" containerID="4df5fa9500589425421a100d8b1872a109d6923e6e636cd46219731c2af54a0b" exitCode=0
Feb 23 07:01:01 crc kubenswrapper[4626]: I0223 07:01:01.637991 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:01:01 crc kubenswrapper[4626]: I0223 07:01:01.638069 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 07:01:01 crc kubenswrapper[4626]: I0223 07:01:01.639179 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerDied","Data":"4df5fa9500589425421a100d8b1872a109d6923e6e636cd46219731c2af54a0b"}
Feb 23 07:01:02 crc kubenswrapper[4626]: I0223 07:01:02.395740 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="a5f9731f-2161-4757-97a7-e542f744362c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.184:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:01:02 crc kubenswrapper[4626]: I0223 07:01:02.443117 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 23 07:01:02 crc kubenswrapper[4626]: I0223 07:01:02.467682 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29530501-tbhcl" podStartSLOduration=2.467660281 podStartE2EDuration="2.467660281s" podCreationTimestamp="2026-02-23 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:01.662907349 +0000 UTC m=+1214.002236616" watchObservedRunningTime="2026-02-23 07:01:02.467660281 +0000 UTC m=+1214.806989547"
Feb 23 07:01:02 crc kubenswrapper[4626]: I0223 07:01:02.689339 4626 generic.go:334] "Generic (PLEG): container finished" podID="a57d585c-03c6-4731-8271-2398c0d774ad" containerID="87a2d96e8889ccec5a47a2499e0061f8ca3ab378201ef69903e153954cd8033c" exitCode=0
Feb 23 07:01:02 crc kubenswrapper[4626]: I0223 07:01:02.690904 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerDied","Data":"87a2d96e8889ccec5a47a2499e0061f8ca3ab378201ef69903e153954cd8033c"}
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.294204 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.317428 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-w65pk"]
Feb 23 07:01:03 crc kubenswrapper[4626]: E0223 07:01:03.317892 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="proxy-httpd"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.317906 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="proxy-httpd"
Feb 23 07:01:03 crc kubenswrapper[4626]: E0223 07:01:03.317915 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-central-agent"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.317921 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-central-agent"
Feb 23 07:01:03 crc kubenswrapper[4626]: E0223 07:01:03.317948 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="sg-core"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.317955 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="sg-core"
Feb 23 07:01:03 crc kubenswrapper[4626]: E0223 07:01:03.317978 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-notification-agent"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.317984 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-notification-agent"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.318159 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-notification-agent"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.318169 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="sg-core"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.318185 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="ceilometer-central-agent"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.318198 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" containerName="proxy-httpd"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.318207 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0ed8c1-9098-4598-8390-ed8c709fa057" containerName="heat-cfnapi"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.318217 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="571961b2-ee38-4d10-a958-4157c3624ec2" containerName="heat-api"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.318916 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-w65pk"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.325539 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.325854 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mhb5q"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.326033 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.355428 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-w65pk"]
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.417653 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-786k8\" (UniqueName: \"kubernetes.io/projected/a57d585c-03c6-4731-8271-2398c0d774ad-kube-api-access-786k8\") pod \"a57d585c-03c6-4731-8271-2398c0d774ad\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") "
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.417707 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-sg-core-conf-yaml\") pod \"a57d585c-03c6-4731-8271-2398c0d774ad\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") "
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.421118 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-scripts\") pod \"a57d585c-03c6-4731-8271-2398c0d774ad\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") "
Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.421185 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\"
(UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-config-data\") pod \"a57d585c-03c6-4731-8271-2398c0d774ad\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.421241 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-log-httpd\") pod \"a57d585c-03c6-4731-8271-2398c0d774ad\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.421303 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-run-httpd\") pod \"a57d585c-03c6-4731-8271-2398c0d774ad\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.421395 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-combined-ca-bundle\") pod \"a57d585c-03c6-4731-8271-2398c0d774ad\" (UID: \"a57d585c-03c6-4731-8271-2398c0d774ad\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.422898 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dvvg\" (UniqueName: \"kubernetes.io/projected/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-kube-api-access-9dvvg\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.423077 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-config-data\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: 
\"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.423285 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-scripts\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.423325 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.429094 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a57d585c-03c6-4731-8271-2398c0d774ad" (UID: "a57d585c-03c6-4731-8271-2398c0d774ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.429353 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a57d585c-03c6-4731-8271-2398c0d774ad" (UID: "a57d585c-03c6-4731-8271-2398c0d774ad"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.442693 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-scripts" (OuterVolumeSpecName: "scripts") pod "a57d585c-03c6-4731-8271-2398c0d774ad" (UID: "a57d585c-03c6-4731-8271-2398c0d774ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.466911 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57d585c-03c6-4731-8271-2398c0d774ad-kube-api-access-786k8" (OuterVolumeSpecName: "kube-api-access-786k8") pod "a57d585c-03c6-4731-8271-2398c0d774ad" (UID: "a57d585c-03c6-4731-8271-2398c0d774ad"). InnerVolumeSpecName "kube-api-access-786k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.481025 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a57d585c-03c6-4731-8271-2398c0d774ad" (UID: "a57d585c-03c6-4731-8271-2398c0d774ad"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527115 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-scripts\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527179 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527231 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dvvg\" (UniqueName: \"kubernetes.io/projected/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-kube-api-access-9dvvg\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527361 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-config-data\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527470 4626 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527485 4626 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a57d585c-03c6-4731-8271-2398c0d774ad-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527511 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-786k8\" (UniqueName: \"kubernetes.io/projected/a57d585c-03c6-4731-8271-2398c0d774ad-kube-api-access-786k8\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527523 4626 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.527531 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.539202 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-scripts\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.539656 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.568822 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.570164 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dvvg\" (UniqueName: \"kubernetes.io/projected/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-kube-api-access-9dvvg\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.570356 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-config-data\") pod \"nova-cell0-conductor-db-sync-w65pk\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") " pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.621924 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a57d585c-03c6-4731-8271-2398c0d774ad" (UID: "a57d585c-03c6-4731-8271-2398c0d774ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.635449 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.643881 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-w65pk" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745470 4626 generic.go:334] "Generic (PLEG): container finished" podID="774d5101-24e2-4871-9a1a-f136698cf092" containerID="04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" exitCode=0 Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745547 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-655f59485b-t5d4q" event={"ID":"774d5101-24e2-4871-9a1a-f136698cf092","Type":"ContainerDied","Data":"04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1"} Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745569 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s95rg\" (UniqueName: \"kubernetes.io/projected/774d5101-24e2-4871-9a1a-f136698cf092-kube-api-access-s95rg\") pod \"774d5101-24e2-4871-9a1a-f136698cf092\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745592 4626 scope.go:117] "RemoveContainer" containerID="04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745605 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data\") pod \"774d5101-24e2-4871-9a1a-f136698cf092\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745666 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-combined-ca-bundle\") pod \"774d5101-24e2-4871-9a1a-f136698cf092\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745691 4626 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data-custom\") pod \"774d5101-24e2-4871-9a1a-f136698cf092\" (UID: \"774d5101-24e2-4871-9a1a-f136698cf092\") " Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745715 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-655f59485b-t5d4q" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.745581 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-655f59485b-t5d4q" event={"ID":"774d5101-24e2-4871-9a1a-f136698cf092","Type":"ContainerDied","Data":"0a5a5f0e358fa87b1dfb3b7e05f0b089649bea7a19f9cf36580896124ce250a9"} Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.770909 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "774d5101-24e2-4871-9a1a-f136698cf092" (UID: "774d5101-24e2-4871-9a1a-f136698cf092"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.771045 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774d5101-24e2-4871-9a1a-f136698cf092-kube-api-access-s95rg" (OuterVolumeSpecName: "kube-api-access-s95rg") pod "774d5101-24e2-4871-9a1a-f136698cf092" (UID: "774d5101-24e2-4871-9a1a-f136698cf092"). InnerVolumeSpecName "kube-api-access-s95rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.771489 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-config-data" (OuterVolumeSpecName: "config-data") pod "a57d585c-03c6-4731-8271-2398c0d774ad" (UID: "a57d585c-03c6-4731-8271-2398c0d774ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.775958 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a57d585c-03c6-4731-8271-2398c0d774ad","Type":"ContainerDied","Data":"5d24ab5edec714694b5fe2107c61a1029fe66f11e38cd4436db8e0eb2ef49cba"} Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.776061 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.846311 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data" (OuterVolumeSpecName: "config-data") pod "774d5101-24e2-4871-9a1a-f136698cf092" (UID: "774d5101-24e2-4871-9a1a-f136698cf092"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.848326 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a57d585c-03c6-4731-8271-2398c0d774ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.851266 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s95rg\" (UniqueName: \"kubernetes.io/projected/774d5101-24e2-4871-9a1a-f136698cf092-kube-api-access-s95rg\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.851339 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.851395 4626 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.872392 4626 scope.go:117] "RemoveContainer" containerID="04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.877523 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.888563 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:03 crc kubenswrapper[4626]: E0223 07:01:03.890375 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1\": container with ID starting with 04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1 not found: ID does not exist" 
containerID="04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.890417 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1"} err="failed to get container status \"04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1\": rpc error: code = NotFound desc = could not find container \"04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1\": container with ID starting with 04a5acf461f499da4058ae13bf729591e417650c8f685fdbd8c1d7b8af775ff1 not found: ID does not exist" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.890450 4626 scope.go:117] "RemoveContainer" containerID="4a72a2626560273a8c17060205a4b92de7f8e7aa4470ca12686eb935d791dc8c" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.907547 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:03 crc kubenswrapper[4626]: E0223 07:01:03.907976 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774d5101-24e2-4871-9a1a-f136698cf092" containerName="heat-engine" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.907989 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="774d5101-24e2-4871-9a1a-f136698cf092" containerName="heat-engine" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.908163 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="774d5101-24e2-4871-9a1a-f136698cf092" containerName="heat-engine" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.909705 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.914214 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.914438 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.917109 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.953050 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-scripts\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.953079 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-log-httpd\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.953103 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.953176 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-config-data\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " 
pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.953253 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.953276 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz4zg\" (UniqueName: \"kubernetes.io/projected/07e4b358-a18c-4a0d-9b68-efa68781c93e-kube-api-access-rz4zg\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.953296 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-run-httpd\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.954224 4626 scope.go:117] "RemoveContainer" containerID="ba7421a8b00f3e04a25123b3aa361763691859e236695fe28a75881e196e0b0d" Feb 23 07:01:03 crc kubenswrapper[4626]: I0223 07:01:03.958002 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "774d5101-24e2-4871-9a1a-f136698cf092" (UID: "774d5101-24e2-4871-9a1a-f136698cf092"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.021098 4626 scope.go:117] "RemoveContainer" containerID="4df5fa9500589425421a100d8b1872a109d6923e6e636cd46219731c2af54a0b" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.027811 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a57d585c-03c6-4731-8271-2398c0d774ad" path="/var/lib/kubelet/pods/a57d585c-03c6-4731-8271-2398c0d774ad/volumes" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.067692 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-config-data\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.067873 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.067905 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz4zg\" (UniqueName: \"kubernetes.io/projected/07e4b358-a18c-4a0d-9b68-efa68781c93e-kube-api-access-rz4zg\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.067929 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-run-httpd\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.069128 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-scripts\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.069163 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-log-httpd\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.069185 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.071836 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-run-httpd\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.072223 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-log-httpd\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.074401 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-scripts\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 
07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.076741 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774d5101-24e2-4871-9a1a-f136698cf092-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.078968 4626 scope.go:117] "RemoveContainer" containerID="87a2d96e8889ccec5a47a2499e0061f8ca3ab378201ef69903e153954cd8033c" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.094550 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz4zg\" (UniqueName: \"kubernetes.io/projected/07e4b358-a18c-4a0d-9b68-efa68781c93e-kube-api-access-rz4zg\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.095250 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.116654 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-config-data\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.133138 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.147709 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-engine-655f59485b-t5d4q"] Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.154756 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-655f59485b-t5d4q"] Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.251673 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.275049 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-w65pk"] Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.336693 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.336974 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.348659 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.777418 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.805426 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-w65pk" event={"ID":"3fd63778-7b2b-4377-b4c3-62b2c15d17e5","Type":"ContainerStarted","Data":"4d35443fc563401cb55f31a0d050fca291330661c5b16a9e85a8116b6517085e"} Feb 23 07:01:04 crc kubenswrapper[4626]: W0223 07:01:04.836891 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07e4b358_a18c_4a0d_9b68_efa68781c93e.slice/crio-6d6ba2401ebb0ee2de9231c79e0d5f70084d7abebdae7761a223ce81ccb9562f WatchSource:0}: Error finding container 6d6ba2401ebb0ee2de9231c79e0d5f70084d7abebdae7761a223ce81ccb9562f: Status 404 returned error can't find the container with 
id 6d6ba2401ebb0ee2de9231c79e0d5f70084d7abebdae7761a223ce81ccb9562f Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.840791 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.841758 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.917130 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 07:01:04 crc kubenswrapper[4626]: I0223 07:01:04.928653 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 07:01:05 crc kubenswrapper[4626]: I0223 07:01:05.857963 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerStarted","Data":"6d6ba2401ebb0ee2de9231c79e0d5f70084d7abebdae7761a223ce81ccb9562f"} Feb 23 07:01:05 crc kubenswrapper[4626]: I0223 07:01:05.858641 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:01:05 crc kubenswrapper[4626]: I0223 07:01:05.858721 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:01:05 crc kubenswrapper[4626]: I0223 07:01:05.992481 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="774d5101-24e2-4871-9a1a-f136698cf092" path="/var/lib/kubelet/pods/774d5101-24e2-4871-9a1a-f136698cf092/volumes" Feb 23 07:01:06 crc kubenswrapper[4626]: I0223 07:01:06.873834 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerStarted","Data":"6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81"} Feb 23 07:01:06 crc kubenswrapper[4626]: I0223 07:01:06.874244 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerStarted","Data":"cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38"} Feb 23 07:01:06 crc kubenswrapper[4626]: I0223 07:01:06.880444 4626 generic.go:334] "Generic (PLEG): container finished" podID="4b61fdde-2209-4bd7-b3c1-a8f4123825a1" containerID="f1368d5f236351434fe1b4d9fed5bab00e947c72bbfef9ffaf544ff74addd0eb" exitCode=0 Feb 23 07:01:06 crc kubenswrapper[4626]: I0223 07:01:06.881804 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530501-tbhcl" event={"ID":"4b61fdde-2209-4bd7-b3c1-a8f4123825a1","Type":"ContainerDied","Data":"f1368d5f236351434fe1b4d9fed5bab00e947c72bbfef9ffaf544ff74addd0eb"} Feb 23 07:01:07 crc kubenswrapper[4626]: I0223 07:01:07.905740 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerStarted","Data":"0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff"} Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.425207 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530501-tbhcl" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.520531 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-fernet-keys\") pod \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.520607 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-config-data\") pod \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.520884 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-combined-ca-bundle\") pod \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.520912 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd5pd\" (UniqueName: \"kubernetes.io/projected/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-kube-api-access-cd5pd\") pod \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\" (UID: \"4b61fdde-2209-4bd7-b3c1-a8f4123825a1\") " Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.548008 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4b61fdde-2209-4bd7-b3c1-a8f4123825a1" (UID: "4b61fdde-2209-4bd7-b3c1-a8f4123825a1"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.550435 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-kube-api-access-cd5pd" (OuterVolumeSpecName: "kube-api-access-cd5pd") pod "4b61fdde-2209-4bd7-b3c1-a8f4123825a1" (UID: "4b61fdde-2209-4bd7-b3c1-a8f4123825a1"). InnerVolumeSpecName "kube-api-access-cd5pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.586185 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b61fdde-2209-4bd7-b3c1-a8f4123825a1" (UID: "4b61fdde-2209-4bd7-b3c1-a8f4123825a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.617590 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-config-data" (OuterVolumeSpecName: "config-data") pod "4b61fdde-2209-4bd7-b3c1-a8f4123825a1" (UID: "4b61fdde-2209-4bd7-b3c1-a8f4123825a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.623746 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.623776 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.623789 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd5pd\" (UniqueName: \"kubernetes.io/projected/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-kube-api-access-cd5pd\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.623798 4626 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b61fdde-2209-4bd7-b3c1-a8f4123825a1-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.932711 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530501-tbhcl" event={"ID":"4b61fdde-2209-4bd7-b3c1-a8f4123825a1","Type":"ContainerDied","Data":"09d7ab9cce8bf206f397dbcbd1ab0d4869f3e431651e5681ccc23346e3739a16"} Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.933886 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d7ab9cce8bf206f397dbcbd1ab0d4869f3e431651e5681ccc23346e3739a16" Feb 23 07:01:08 crc kubenswrapper[4626]: I0223 07:01:08.933633 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530501-tbhcl" Feb 23 07:01:09 crc kubenswrapper[4626]: I0223 07:01:09.897869 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:01:09 crc kubenswrapper[4626]: I0223 07:01:09.898024 4626 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:01:09 crc kubenswrapper[4626]: I0223 07:01:09.971283 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerStarted","Data":"afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef"} Feb 23 07:01:09 crc kubenswrapper[4626]: I0223 07:01:09.973689 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:01:09 crc kubenswrapper[4626]: I0223 07:01:09.982425 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:01:10 crc kubenswrapper[4626]: I0223 07:01:10.019732 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.043698704 podStartE2EDuration="7.019710625s" podCreationTimestamp="2026-02-23 07:01:03 +0000 UTC" firstStartedPulling="2026-02-23 07:01:04.849280547 +0000 UTC m=+1217.188609812" lastFinishedPulling="2026-02-23 07:01:08.825292467 +0000 UTC m=+1221.164621733" observedRunningTime="2026-02-23 07:01:09.996295417 +0000 UTC m=+1222.335624684" watchObservedRunningTime="2026-02-23 07:01:10.019710625 +0000 UTC m=+1222.359039891" Feb 23 07:01:14 crc kubenswrapper[4626]: I0223 07:01:14.740005 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:14 crc kubenswrapper[4626]: I0223 07:01:14.742968 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-central-agent" containerID="cri-o://cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38" gracePeriod=30 Feb 23 07:01:14 crc kubenswrapper[4626]: I0223 07:01:14.743457 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="proxy-httpd" containerID="cri-o://afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef" gracePeriod=30 Feb 23 07:01:14 crc kubenswrapper[4626]: I0223 07:01:14.743525 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="sg-core" containerID="cri-o://0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff" gracePeriod=30 Feb 23 07:01:14 crc kubenswrapper[4626]: I0223 07:01:14.743565 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-notification-agent" containerID="cri-o://6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81" gracePeriod=30 Feb 23 07:01:15 crc kubenswrapper[4626]: I0223 07:01:15.067258 4626 generic.go:334] "Generic (PLEG): container finished" podID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerID="afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef" exitCode=0 Feb 23 07:01:15 crc kubenswrapper[4626]: I0223 07:01:15.067288 4626 generic.go:334] "Generic (PLEG): container finished" podID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerID="0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff" exitCode=2 Feb 23 07:01:15 crc kubenswrapper[4626]: I0223 07:01:15.067316 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerDied","Data":"afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef"} Feb 23 07:01:15 crc kubenswrapper[4626]: I0223 07:01:15.067382 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerDied","Data":"0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff"} Feb 23 07:01:16 crc kubenswrapper[4626]: I0223 07:01:16.079476 4626 generic.go:334] "Generic (PLEG): container finished" podID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerID="6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81" exitCode=0 Feb 23 07:01:16 crc kubenswrapper[4626]: I0223 07:01:16.079527 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerDied","Data":"6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81"} Feb 23 07:01:16 crc kubenswrapper[4626]: I0223 07:01:16.803145 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5c9c23ae-e41d-48eb-8d54-2887f2b0e9de" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.171:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:01:19 crc kubenswrapper[4626]: I0223 07:01:19.162784 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-w65pk" event={"ID":"3fd63778-7b2b-4377-b4c3-62b2c15d17e5","Type":"ContainerStarted","Data":"de25619430f26966b0309fa56049de3497b6f1677868157e6ec74984e187cfed"} Feb 23 07:01:19 crc kubenswrapper[4626]: I0223 07:01:19.186890 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-w65pk" podStartSLOduration=2.34209646 podStartE2EDuration="16.186852462s" podCreationTimestamp="2026-02-23 07:01:03 +0000 UTC" 
firstStartedPulling="2026-02-23 07:01:04.285581488 +0000 UTC m=+1216.624910754" lastFinishedPulling="2026-02-23 07:01:18.13033749 +0000 UTC m=+1230.469666756" observedRunningTime="2026-02-23 07:01:19.177747209 +0000 UTC m=+1231.517076475" watchObservedRunningTime="2026-02-23 07:01:19.186852462 +0000 UTC m=+1231.526181728" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.835120 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.921307 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-scripts\") pod \"07e4b358-a18c-4a0d-9b68-efa68781c93e\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.921441 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-config-data\") pod \"07e4b358-a18c-4a0d-9b68-efa68781c93e\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.921474 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-combined-ca-bundle\") pod \"07e4b358-a18c-4a0d-9b68-efa68781c93e\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.921590 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-log-httpd\") pod \"07e4b358-a18c-4a0d-9b68-efa68781c93e\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.921855 4626 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rz4zg\" (UniqueName: \"kubernetes.io/projected/07e4b358-a18c-4a0d-9b68-efa68781c93e-kube-api-access-rz4zg\") pod \"07e4b358-a18c-4a0d-9b68-efa68781c93e\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.921894 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-sg-core-conf-yaml\") pod \"07e4b358-a18c-4a0d-9b68-efa68781c93e\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.921974 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-run-httpd\") pod \"07e4b358-a18c-4a0d-9b68-efa68781c93e\" (UID: \"07e4b358-a18c-4a0d-9b68-efa68781c93e\") " Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.922170 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07e4b358-a18c-4a0d-9b68-efa68781c93e" (UID: "07e4b358-a18c-4a0d-9b68-efa68781c93e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.922456 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07e4b358-a18c-4a0d-9b68-efa68781c93e" (UID: "07e4b358-a18c-4a0d-9b68-efa68781c93e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.922953 4626 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.922973 4626 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07e4b358-a18c-4a0d-9b68-efa68781c93e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.928481 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-scripts" (OuterVolumeSpecName: "scripts") pod "07e4b358-a18c-4a0d-9b68-efa68781c93e" (UID: "07e4b358-a18c-4a0d-9b68-efa68781c93e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.948884 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e4b358-a18c-4a0d-9b68-efa68781c93e-kube-api-access-rz4zg" (OuterVolumeSpecName: "kube-api-access-rz4zg") pod "07e4b358-a18c-4a0d-9b68-efa68781c93e" (UID: "07e4b358-a18c-4a0d-9b68-efa68781c93e"). InnerVolumeSpecName "kube-api-access-rz4zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.963422 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07e4b358-a18c-4a0d-9b68-efa68781c93e" (UID: "07e4b358-a18c-4a0d-9b68-efa68781c93e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:23 crc kubenswrapper[4626]: I0223 07:01:23.989397 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07e4b358-a18c-4a0d-9b68-efa68781c93e" (UID: "07e4b358-a18c-4a0d-9b68-efa68781c93e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.025212 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-config-data" (OuterVolumeSpecName: "config-data") pod "07e4b358-a18c-4a0d-9b68-efa68781c93e" (UID: "07e4b358-a18c-4a0d-9b68-efa68781c93e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.026444 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz4zg\" (UniqueName: \"kubernetes.io/projected/07e4b358-a18c-4a0d-9b68-efa68781c93e-kube-api-access-rz4zg\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.026466 4626 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.026478 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.026487 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 
07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.026515 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e4b358-a18c-4a0d-9b68-efa68781c93e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.212701 4626 generic.go:334] "Generic (PLEG): container finished" podID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerID="cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38" exitCode=0 Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.212768 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerDied","Data":"cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38"} Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.212809 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07e4b358-a18c-4a0d-9b68-efa68781c93e","Type":"ContainerDied","Data":"6d6ba2401ebb0ee2de9231c79e0d5f70084d7abebdae7761a223ce81ccb9562f"} Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.212835 4626 scope.go:117] "RemoveContainer" containerID="afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.213041 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.244777 4626 scope.go:117] "RemoveContainer" containerID="0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.260662 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.269057 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.275528 4626 scope.go:117] "RemoveContainer" containerID="6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.292300 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.292959 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b61fdde-2209-4bd7-b3c1-a8f4123825a1" containerName="keystone-cron" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.292983 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b61fdde-2209-4bd7-b3c1-a8f4123825a1" containerName="keystone-cron" Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.292997 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="proxy-httpd" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293004 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="proxy-httpd" Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.293016 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-central-agent" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293024 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-central-agent" Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.293055 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="sg-core" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293061 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="sg-core" Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.293069 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-notification-agent" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293076 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-notification-agent" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293296 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-central-agent" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293308 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="proxy-httpd" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293320 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b61fdde-2209-4bd7-b3c1-a8f4123825a1" containerName="keystone-cron" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293333 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="sg-core" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.293345 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" containerName="ceilometer-notification-agent" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.295140 4626 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.298761 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.299793 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.334764 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-scripts\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.334834 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.334906 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-log-httpd\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.335019 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0" Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.335043 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6q85\" (UniqueName: \"kubernetes.io/projected/3e51989b-c11c-407c-a12f-dbada333bbee-kube-api-access-c6q85\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.335064 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-run-httpd\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.335095 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-config-data\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.337300 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.358171 4626 scope.go:117] "RemoveContainer" containerID="cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.397035 4626 scope.go:117] "RemoveContainer" containerID="afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef"
Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.397826 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef\": container with ID starting with afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef not found: ID does not exist" containerID="afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.397859 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef"} err="failed to get container status \"afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef\": rpc error: code = NotFound desc = could not find container \"afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef\": container with ID starting with afb8be6971922de936e38998aff36efa89fc0e7e84a939526bf87adf7d6c96ef not found: ID does not exist"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.397887 4626 scope.go:117] "RemoveContainer" containerID="0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff"
Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.398287 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff\": container with ID starting with 0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff not found: ID does not exist" containerID="0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.398312 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff"} err="failed to get container status \"0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff\": rpc error: code = NotFound desc = could not find container \"0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff\": container with ID starting with 0943ed09f7946192d04e45719bc1792999b06b2aadf6afb4b0949e3858b27dff not found: ID does not exist"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.398329 4626 scope.go:117] "RemoveContainer" containerID="6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81"
Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.398757 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81\": container with ID starting with 6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81 not found: ID does not exist" containerID="6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.398791 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81"} err="failed to get container status \"6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81\": rpc error: code = NotFound desc = could not find container \"6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81\": container with ID starting with 6b1899232f3f907a3adeb192f40d096a63713b6ee53f66ac0a2e644b6b2f0c81 not found: ID does not exist"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.398815 4626 scope.go:117] "RemoveContainer" containerID="cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38"
Feb 23 07:01:24 crc kubenswrapper[4626]: E0223 07:01:24.399400 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38\": container with ID starting with cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38 not found: ID does not exist" containerID="cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.399452 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38"} err="failed to get container status \"cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38\": rpc error: code = NotFound desc = could not find container \"cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38\": container with ID starting with cf3146ba250d4b33b666bcd0ac928a082849abd2d3fb77a847c5102b16abcb38 not found: ID does not exist"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.436461 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-scripts\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.436521 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.436565 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-log-httpd\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.436614 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.436631 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6q85\" (UniqueName: \"kubernetes.io/projected/3e51989b-c11c-407c-a12f-dbada333bbee-kube-api-access-c6q85\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.436647 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-run-httpd\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.436664 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-config-data\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.440936 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-log-httpd\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.441242 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-run-httpd\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.444396 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.446974 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.451099 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-scripts\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.462069 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-config-data\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.468050 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6q85\" (UniqueName: \"kubernetes.io/projected/3e51989b-c11c-407c-a12f-dbada333bbee-kube-api-access-c6q85\") pod \"ceilometer-0\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " pod="openstack/ceilometer-0"
Feb 23 07:01:24 crc kubenswrapper[4626]: I0223 07:01:24.633113 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:01:25 crc kubenswrapper[4626]: I0223 07:01:25.290428 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:01:25 crc kubenswrapper[4626]: I0223 07:01:25.998046 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07e4b358-a18c-4a0d-9b68-efa68781c93e" path="/var/lib/kubelet/pods/07e4b358-a18c-4a0d-9b68-efa68781c93e/volumes"
Feb 23 07:01:26 crc kubenswrapper[4626]: I0223 07:01:26.232126 4626 generic.go:334] "Generic (PLEG): container finished" podID="3fd63778-7b2b-4377-b4c3-62b2c15d17e5" containerID="de25619430f26966b0309fa56049de3497b6f1677868157e6ec74984e187cfed" exitCode=0
Feb 23 07:01:26 crc kubenswrapper[4626]: I0223 07:01:26.232213 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-w65pk" event={"ID":"3fd63778-7b2b-4377-b4c3-62b2c15d17e5","Type":"ContainerDied","Data":"de25619430f26966b0309fa56049de3497b6f1677868157e6ec74984e187cfed"}
Feb 23 07:01:26 crc kubenswrapper[4626]: I0223 07:01:26.233944 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerStarted","Data":"a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b"}
Feb 23 07:01:26 crc kubenswrapper[4626]: I0223 07:01:26.233998 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerStarted","Data":"4181ae1fc5b9a5240f77c249df4af53a04da175db0eef2e8799dfc435bb6c445"}
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.244196 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerStarted","Data":"4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4"}
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.553041 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-w65pk"
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.618372 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dvvg\" (UniqueName: \"kubernetes.io/projected/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-kube-api-access-9dvvg\") pod \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") "
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.618412 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-combined-ca-bundle\") pod \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") "
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.618458 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-config-data\") pod \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") "
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.618576 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-scripts\") pod \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\" (UID: \"3fd63778-7b2b-4377-b4c3-62b2c15d17e5\") "
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.626574 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-scripts" (OuterVolumeSpecName: "scripts") pod "3fd63778-7b2b-4377-b4c3-62b2c15d17e5" (UID: "3fd63778-7b2b-4377-b4c3-62b2c15d17e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.626662 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-kube-api-access-9dvvg" (OuterVolumeSpecName: "kube-api-access-9dvvg") pod "3fd63778-7b2b-4377-b4c3-62b2c15d17e5" (UID: "3fd63778-7b2b-4377-b4c3-62b2c15d17e5"). InnerVolumeSpecName "kube-api-access-9dvvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.666297 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fd63778-7b2b-4377-b4c3-62b2c15d17e5" (UID: "3fd63778-7b2b-4377-b4c3-62b2c15d17e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.669044 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-config-data" (OuterVolumeSpecName: "config-data") pod "3fd63778-7b2b-4377-b4c3-62b2c15d17e5" (UID: "3fd63778-7b2b-4377-b4c3-62b2c15d17e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.721023 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.721046 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dvvg\" (UniqueName: \"kubernetes.io/projected/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-kube-api-access-9dvvg\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.721057 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:27 crc kubenswrapper[4626]: I0223 07:01:27.721066 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd63778-7b2b-4377-b4c3-62b2c15d17e5-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.277481 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerStarted","Data":"65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416"}
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.285538 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-w65pk" event={"ID":"3fd63778-7b2b-4377-b4c3-62b2c15d17e5","Type":"ContainerDied","Data":"4d35443fc563401cb55f31a0d050fca291330661c5b16a9e85a8116b6517085e"}
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.285604 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d35443fc563401cb55f31a0d050fca291330661c5b16a9e85a8116b6517085e"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.285645 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-w65pk"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.358746 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 07:01:28 crc kubenswrapper[4626]: E0223 07:01:28.359312 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd63778-7b2b-4377-b4c3-62b2c15d17e5" containerName="nova-cell0-conductor-db-sync"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.359333 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd63778-7b2b-4377-b4c3-62b2c15d17e5" containerName="nova-cell0-conductor-db-sync"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.359589 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd63778-7b2b-4377-b4c3-62b2c15d17e5" containerName="nova-cell0-conductor-db-sync"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.360285 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.366332 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mhb5q"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.369994 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.372183 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.536417 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6972135-5165-4e01-9a21-591d7c07c533-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.536513 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptbqf\" (UniqueName: \"kubernetes.io/projected/e6972135-5165-4e01-9a21-591d7c07c533-kube-api-access-ptbqf\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.536903 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6972135-5165-4e01-9a21-591d7c07c533-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.639141 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6972135-5165-4e01-9a21-591d7c07c533-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.639302 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6972135-5165-4e01-9a21-591d7c07c533-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.639356 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptbqf\" (UniqueName: \"kubernetes.io/projected/e6972135-5165-4e01-9a21-591d7c07c533-kube-api-access-ptbqf\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.647384 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6972135-5165-4e01-9a21-591d7c07c533-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.647717 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6972135-5165-4e01-9a21-591d7c07c533-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.656640 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptbqf\" (UniqueName: \"kubernetes.io/projected/e6972135-5165-4e01-9a21-591d7c07c533-kube-api-access-ptbqf\") pod \"nova-cell0-conductor-0\" (UID: \"e6972135-5165-4e01-9a21-591d7c07c533\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:28 crc kubenswrapper[4626]: I0223 07:01:28.693806 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:29 crc kubenswrapper[4626]: I0223 07:01:29.191061 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 07:01:29 crc kubenswrapper[4626]: I0223 07:01:29.303688 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e6972135-5165-4e01-9a21-591d7c07c533","Type":"ContainerStarted","Data":"474b1960da3272b2f7b5aae100a423d7e2751c6f7dc04fdc754ceb1bbdf4a75d"}
Feb 23 07:01:30 crc kubenswrapper[4626]: I0223 07:01:30.318541 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerStarted","Data":"31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b"}
Feb 23 07:01:30 crc kubenswrapper[4626]: I0223 07:01:30.319046 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 23 07:01:30 crc kubenswrapper[4626]: I0223 07:01:30.320979 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e6972135-5165-4e01-9a21-591d7c07c533","Type":"ContainerStarted","Data":"a82af9a34e487a2ebd9e913a4d166fefa87a02c9a4c2cd19f296cc76328bb696"}
Feb 23 07:01:30 crc kubenswrapper[4626]: I0223 07:01:30.321104 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:30 crc kubenswrapper[4626]: I0223 07:01:30.364013 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.429419495 podStartE2EDuration="6.363993885s" podCreationTimestamp="2026-02-23 07:01:24 +0000 UTC" firstStartedPulling="2026-02-23 07:01:25.297547342 +0000 UTC m=+1237.636876607" lastFinishedPulling="2026-02-23 07:01:29.232121731 +0000 UTC m=+1241.571450997" observedRunningTime="2026-02-23 07:01:30.361005901 +0000 UTC m=+1242.700335167" watchObservedRunningTime="2026-02-23 07:01:30.363993885 +0000 UTC m=+1242.703323150"
Feb 23 07:01:30 crc kubenswrapper[4626]: I0223 07:01:30.385858 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.385821407 podStartE2EDuration="2.385821407s" podCreationTimestamp="2026-02-23 07:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:30.375614473 +0000 UTC m=+1242.714943739" watchObservedRunningTime="2026-02-23 07:01:30.385821407 +0000 UTC m=+1242.725150693"
Feb 23 07:01:38 crc kubenswrapper[4626]: I0223 07:01:38.721616 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.328941 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pbxnw"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.340426 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.353668 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.354042 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.357546 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pbxnw"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.427489 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npcf8\" (UniqueName: \"kubernetes.io/projected/2405cf95-00e4-40c0-bd99-266460b42580-kube-api-access-npcf8\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.427590 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.427628 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-scripts\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.428005 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-config-data\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.478245 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.480137 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.487565 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.489324 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.492011 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.492260 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.495760 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.505283 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535001 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66e0eb4d-4467-485f-9203-e4674b950a2b-logs\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535072 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc739604-b750-4eb0-ada8-55c64c50badd-logs\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535151 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-config-data\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535211 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-config-data\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535287 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-config-data\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535308 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xlt\" (UniqueName: \"kubernetes.io/projected/bc739604-b750-4eb0-ada8-55c64c50badd-kube-api-access-r5xlt\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535338 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npcf8\" (UniqueName: \"kubernetes.io/projected/2405cf95-00e4-40c0-bd99-266460b42580-kube-api-access-npcf8\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535385 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535414 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535445 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-scripts\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535473 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.535561 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4572\" (UniqueName: \"kubernetes.io/projected/66e0eb4d-4467-485f-9203-e4674b950a2b-kube-api-access-r4572\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.540576 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.541848 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.551370 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.552134 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-config-data\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.552470 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-scripts\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.565287 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.590975 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npcf8\" (UniqueName: \"kubernetes.io/projected/2405cf95-00e4-40c0-bd99-266460b42580-kube-api-access-npcf8\") pod \"nova-cell0-cell-mapping-pbxnw\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.609111 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.642265 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4572\" (UniqueName: \"kubernetes.io/projected/66e0eb4d-4467-485f-9203-e4674b950a2b-kube-api-access-r4572\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.642395 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66e0eb4d-4467-485f-9203-e4674b950a2b-logs\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.642456 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc739604-b750-4eb0-ada8-55c64c50badd-logs\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.642548 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-config-data\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0"
Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.642609 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-config-data\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0"
Feb 23 07:01:39
crc kubenswrapper[4626]: I0223 07:01:39.642673 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xlt\" (UniqueName: \"kubernetes.io/projected/bc739604-b750-4eb0-ada8-55c64c50badd-kube-api-access-r5xlt\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.642733 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.642765 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.643876 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc739604-b750-4eb0-ada8-55c64c50badd-logs\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.645870 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66e0eb4d-4467-485f-9203-e4674b950a2b-logs\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.654085 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-config-data\") pod 
\"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.661420 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.662112 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.662680 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-config-data\") pod \"nova-api-0\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.715023 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xlt\" (UniqueName: \"kubernetes.io/projected/bc739604-b750-4eb0-ada8-55c64c50badd-kube-api-access-r5xlt\") pod \"nova-metadata-0\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " pod="openstack/nova-metadata-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.715095 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.715880 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4572\" (UniqueName: \"kubernetes.io/projected/66e0eb4d-4467-485f-9203-e4674b950a2b-kube-api-access-r4572\") pod \"nova-api-0\" (UID: 
\"66e0eb4d-4467-485f-9203-e4674b950a2b\") " pod="openstack/nova-api-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.716599 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.722870 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.723534 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pbxnw" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.729174 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.756126 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.756275 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-config-data\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.756446 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlrvp\" (UniqueName: \"kubernetes.io/projected/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-kube-api-access-tlrvp\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.804306 
4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.815155 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.822110 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5cc5d4d5-zchg2"] Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.827825 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.857946 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.858015 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlrvp\" (UniqueName: \"kubernetes.io/projected/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-kube-api-access-tlrvp\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.858198 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.858417 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.858529 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrl4\" (UniqueName: \"kubernetes.io/projected/d778a9bb-ee5d-4974-909f-3a16b3db565b-kube-api-access-xzrl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.858706 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-config-data\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.867292 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-config-data\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.867406 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.876141 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5cc5d4d5-zchg2"] Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.888828 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlrvp\" 
(UniqueName: \"kubernetes.io/projected/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-kube-api-access-tlrvp\") pod \"nova-scheduler-0\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") " pod="openstack/nova-scheduler-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962083 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962184 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrl4\" (UniqueName: \"kubernetes.io/projected/d778a9bb-ee5d-4974-909f-3a16b3db565b-kube-api-access-xzrl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962233 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962288 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-svc\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962343 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962362 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962379 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqsdm\" (UniqueName: \"kubernetes.io/projected/f0dbf362-e5f3-461b-8641-9e043f490538-kube-api-access-mqsdm\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962407 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-config\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.962424 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.973346 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.987832 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:39 crc kubenswrapper[4626]: I0223 07:01:39.997838 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrl4\" (UniqueName: \"kubernetes.io/projected/d778a9bb-ee5d-4974-909f-3a16b3db565b-kube-api-access-xzrl4\") pod \"nova-cell1-novncproxy-0\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.061373 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.067657 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.067719 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqsdm\" (UniqueName: \"kubernetes.io/projected/f0dbf362-e5f3-461b-8641-9e043f490538-kube-api-access-mqsdm\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.067755 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-config\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.067778 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.067893 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " 
pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.067970 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-svc\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.068699 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.069422 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-config\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.070354 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.070938 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 
07:01:40.071081 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-svc\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.086959 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.094327 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqsdm\" (UniqueName: \"kubernetes.io/projected/f0dbf362-e5f3-461b-8641-9e043f490538-kube-api-access-mqsdm\") pod \"dnsmasq-dns-6d5cc5d4d5-zchg2\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.166260 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.382378 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pbxnw"] Feb 23 07:01:40 crc kubenswrapper[4626]: W0223 07:01:40.394836 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2405cf95_00e4_40c0_bd99_266460b42580.slice/crio-73bd6ae43992adc3f7d1a921d0de546788fbc24b5f681dcfc9a45459060d294f WatchSource:0}: Error finding container 73bd6ae43992adc3f7d1a921d0de546788fbc24b5f681dcfc9a45459060d294f: Status 404 returned error can't find the container with id 73bd6ae43992adc3f7d1a921d0de546788fbc24b5f681dcfc9a45459060d294f Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.495875 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.535374 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pbxnw" event={"ID":"2405cf95-00e4-40c0-bd99-266460b42580","Type":"ContainerStarted","Data":"73bd6ae43992adc3f7d1a921d0de546788fbc24b5f681dcfc9a45459060d294f"} Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.536947 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:01:40 crc kubenswrapper[4626]: W0223 07:01:40.544026 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66e0eb4d_4467_485f_9203_e4674b950a2b.slice/crio-5abf35ccf542474def5946becb75e89575df145775859a583353b4061ada79ad WatchSource:0}: Error finding container 5abf35ccf542474def5946becb75e89575df145775859a583353b4061ada79ad: Status 404 returned error can't find the container with id 5abf35ccf542474def5946becb75e89575df145775859a583353b4061ada79ad Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.756725 4626 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.770938 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:01:40 crc kubenswrapper[4626]: I0223 07:01:40.997193 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5cc5d4d5-zchg2"] Feb 23 07:01:41 crc kubenswrapper[4626]: W0223 07:01:41.004559 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0dbf362_e5f3_461b_8641_9e043f490538.slice/crio-b7d30e5f5e9a803558680d905021c2279ba60dd389e7e37b538b5520c31cad2d WatchSource:0}: Error finding container b7d30e5f5e9a803558680d905021c2279ba60dd389e7e37b538b5520c31cad2d: Status 404 returned error can't find the container with id b7d30e5f5e9a803558680d905021c2279ba60dd389e7e37b538b5520c31cad2d Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.589844 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d778a9bb-ee5d-4974-909f-3a16b3db565b","Type":"ContainerStarted","Data":"c1e1d243c0cd34c4fab6bafe89a8c5dbcb8599c3c04c38b0301f0928ab0b7e0e"} Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.611640 4626 generic.go:334] "Generic (PLEG): container finished" podID="f0dbf362-e5f3-461b-8641-9e043f490538" containerID="705e17a56d88e62f86c0ed4ace4de233cf49be755d1689438abfd4da428c642c" exitCode=0 Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.612654 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" event={"ID":"f0dbf362-e5f3-461b-8641-9e043f490538","Type":"ContainerDied","Data":"705e17a56d88e62f86c0ed4ace4de233cf49be755d1689438abfd4da428c642c"} Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.612684 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" 
event={"ID":"f0dbf362-e5f3-461b-8641-9e043f490538","Type":"ContainerStarted","Data":"b7d30e5f5e9a803558680d905021c2279ba60dd389e7e37b538b5520c31cad2d"} Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.628894 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc739604-b750-4eb0-ada8-55c64c50badd","Type":"ContainerStarted","Data":"3e4401e0903637404b11ce6090be1cc7f9518fde6f358547a47c1de0d45298e9"} Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.684247 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pbxnw" event={"ID":"2405cf95-00e4-40c0-bd99-266460b42580","Type":"ContainerStarted","Data":"378992167abee806d660f94752873bd86bfb062ec2479f143a977125d0c4c411"} Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.706982 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66e0eb4d-4467-485f-9203-e4674b950a2b","Type":"ContainerStarted","Data":"5abf35ccf542474def5946becb75e89575df145775859a583353b4061ada79ad"} Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.721599 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2360f12-c181-43dc-81b0-cd6ed0ca05f2","Type":"ContainerStarted","Data":"19e58eb4f3bb1b0c707daa908f5c515b2ae92ee6e9dd5ea25d6ee570646bdeb4"} Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.730268 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pbxnw" podStartSLOduration=2.730252986 podStartE2EDuration="2.730252986s" podCreationTimestamp="2026-02-23 07:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:41.713193793 +0000 UTC m=+1254.052523060" watchObservedRunningTime="2026-02-23 07:01:41.730252986 +0000 UTC m=+1254.069582251" Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 
07:01:41.948647 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-682fl"] Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.950212 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.952955 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 23 07:01:41 crc kubenswrapper[4626]: I0223 07:01:41.953384 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.027377 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-682fl"] Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.047061 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-config-data\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.047150 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-scripts\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.047387 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: 
\"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.047427 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxnj\" (UniqueName: \"kubernetes.io/projected/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-kube-api-access-5nxnj\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.149696 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-config-data\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.149801 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-scripts\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.150011 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.150052 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxnj\" (UniqueName: \"kubernetes.io/projected/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-kube-api-access-5nxnj\") pod 
\"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.159768 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-scripts\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.162116 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-config-data\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.168915 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.174118 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxnj\" (UniqueName: \"kubernetes.io/projected/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-kube-api-access-5nxnj\") pod \"nova-cell1-conductor-db-sync-682fl\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.282815 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.741291 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" event={"ID":"f0dbf362-e5f3-461b-8641-9e043f490538","Type":"ContainerStarted","Data":"a51ba017194a177695e0376a2dddd9cba0707274b5d33308278f93cc1a93a616"} Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.741771 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.831702 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" podStartSLOduration=3.83168548 podStartE2EDuration="3.83168548s" podCreationTimestamp="2026-02-23 07:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:42.765278809 +0000 UTC m=+1255.104608095" watchObservedRunningTime="2026-02-23 07:01:42.83168548 +0000 UTC m=+1255.171014746" Feb 23 07:01:42 crc kubenswrapper[4626]: I0223 07:01:42.840643 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-682fl"] Feb 23 07:01:43 crc kubenswrapper[4626]: I0223 07:01:43.123626 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:01:43 crc kubenswrapper[4626]: I0223 07:01:43.142332 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.202971 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.203533 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" 
containerName="ceilometer-central-agent" containerID="cri-o://a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b" gracePeriod=30 Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.203688 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="proxy-httpd" containerID="cri-o://31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b" gracePeriod=30 Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.203732 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="sg-core" containerID="cri-o://65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416" gracePeriod=30 Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.203770 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="ceilometer-notification-agent" containerID="cri-o://4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4" gracePeriod=30 Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.223746 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.787995 4626 generic.go:334] "Generic (PLEG): container finished" podID="3e51989b-c11c-407c-a12f-dbada333bbee" containerID="31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b" exitCode=0 Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.788030 4626 generic.go:334] "Generic (PLEG): container finished" podID="3e51989b-c11c-407c-a12f-dbada333bbee" containerID="65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416" exitCode=2 Feb 23 
07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.788040 4626 generic.go:334] "Generic (PLEG): container finished" podID="3e51989b-c11c-407c-a12f-dbada333bbee" containerID="a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b" exitCode=0 Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.788078 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerDied","Data":"31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b"} Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.788134 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerDied","Data":"65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416"} Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.788146 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerDied","Data":"a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b"} Feb 23 07:01:44 crc kubenswrapper[4626]: I0223 07:01:44.789825 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-682fl" event={"ID":"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35","Type":"ContainerStarted","Data":"08683409904bef95935dea04544df2839e75a36c90c8ad43a53b9afdbb7e1de1"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.800434 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2360f12-c181-43dc-81b0-cd6ed0ca05f2","Type":"ContainerStarted","Data":"3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.802309 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"d778a9bb-ee5d-4974-909f-3a16b3db565b","Type":"ContainerStarted","Data":"e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.802414 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d778a9bb-ee5d-4974-909f-3a16b3db565b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc" gracePeriod=30 Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.805290 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc739604-b750-4eb0-ada8-55c64c50badd","Type":"ContainerStarted","Data":"611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.805329 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc739604-b750-4eb0-ada8-55c64c50badd","Type":"ContainerStarted","Data":"99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.805420 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-log" containerID="cri-o://99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da" gracePeriod=30 Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.805567 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-metadata" containerID="cri-o://611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f" gracePeriod=30 Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.811548 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-682fl" event={"ID":"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35","Type":"ContainerStarted","Data":"40d143df27f38c86086e8f336b531fdac2ea0921b36da45ef4f48a1637d3e67d"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.817214 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66e0eb4d-4467-485f-9203-e4674b950a2b","Type":"ContainerStarted","Data":"9851aebfa9b681137ce6eef38ab0ce44c7e3e32cdd89e62e9ef9267310d20216"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.817239 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66e0eb4d-4467-485f-9203-e4674b950a2b","Type":"ContainerStarted","Data":"974d19150e8e9faf830024e02086ef38bb2916626861bc70bb97ce8ffc4223c8"} Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.846883 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.634188399 podStartE2EDuration="6.84684935s" podCreationTimestamp="2026-02-23 07:01:39 +0000 UTC" firstStartedPulling="2026-02-23 07:01:40.811297967 +0000 UTC m=+1253.150627233" lastFinishedPulling="2026-02-23 07:01:45.023958919 +0000 UTC m=+1257.363288184" observedRunningTime="2026-02-23 07:01:45.814827109 +0000 UTC m=+1258.154156374" watchObservedRunningTime="2026-02-23 07:01:45.84684935 +0000 UTC m=+1258.186178617" Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.858276 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.593232654 podStartE2EDuration="6.858259995s" podCreationTimestamp="2026-02-23 07:01:39 +0000 UTC" firstStartedPulling="2026-02-23 07:01:40.757185215 +0000 UTC m=+1253.096514482" lastFinishedPulling="2026-02-23 07:01:45.022212556 +0000 UTC m=+1257.361541823" observedRunningTime="2026-02-23 07:01:45.841764604 +0000 UTC m=+1258.181093860" watchObservedRunningTime="2026-02-23 
07:01:45.858259995 +0000 UTC m=+1258.197589260" Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.874051 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-682fl" podStartSLOduration=4.874035147 podStartE2EDuration="4.874035147s" podCreationTimestamp="2026-02-23 07:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:45.868773386 +0000 UTC m=+1258.208102642" watchObservedRunningTime="2026-02-23 07:01:45.874035147 +0000 UTC m=+1258.213364413" Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.899112 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.414624835 podStartE2EDuration="6.899088371s" podCreationTimestamp="2026-02-23 07:01:39 +0000 UTC" firstStartedPulling="2026-02-23 07:01:40.537010397 +0000 UTC m=+1252.876339663" lastFinishedPulling="2026-02-23 07:01:45.021473933 +0000 UTC m=+1257.360803199" observedRunningTime="2026-02-23 07:01:45.883759881 +0000 UTC m=+1258.223089147" watchObservedRunningTime="2026-02-23 07:01:45.899088371 +0000 UTC m=+1258.238417636" Feb 23 07:01:45 crc kubenswrapper[4626]: I0223 07:01:45.993471 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.523066375 podStartE2EDuration="6.993434587s" podCreationTimestamp="2026-02-23 07:01:39 +0000 UTC" firstStartedPulling="2026-02-23 07:01:40.56109814 +0000 UTC m=+1252.900427397" lastFinishedPulling="2026-02-23 07:01:45.031466353 +0000 UTC m=+1257.370795609" observedRunningTime="2026-02-23 07:01:45.903551315 +0000 UTC m=+1258.242880581" watchObservedRunningTime="2026-02-23 07:01:45.993434587 +0000 UTC m=+1258.332763843" Feb 23 07:01:46 crc kubenswrapper[4626]: I0223 07:01:46.827972 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="bc739604-b750-4eb0-ada8-55c64c50badd" containerID="99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da" exitCode=143 Feb 23 07:01:46 crc kubenswrapper[4626]: I0223 07:01:46.828612 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc739604-b750-4eb0-ada8-55c64c50badd","Type":"ContainerDied","Data":"99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da"} Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.741709 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.852733 4626 generic.go:334] "Generic (PLEG): container finished" podID="3e51989b-c11c-407c-a12f-dbada333bbee" containerID="4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4" exitCode=0 Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.852791 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerDied","Data":"4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4"} Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.852842 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e51989b-c11c-407c-a12f-dbada333bbee","Type":"ContainerDied","Data":"4181ae1fc5b9a5240f77c249df4af53a04da175db0eef2e8799dfc435bb6c445"} Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.852882 4626 scope.go:117] "RemoveContainer" containerID="31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.852896 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.856684 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-run-httpd\") pod \"3e51989b-c11c-407c-a12f-dbada333bbee\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.856807 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-combined-ca-bundle\") pod \"3e51989b-c11c-407c-a12f-dbada333bbee\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.856988 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-config-data\") pod \"3e51989b-c11c-407c-a12f-dbada333bbee\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.857221 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-sg-core-conf-yaml\") pod \"3e51989b-c11c-407c-a12f-dbada333bbee\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.857288 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-scripts\") pod \"3e51989b-c11c-407c-a12f-dbada333bbee\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.857381 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-log-httpd\") pod \"3e51989b-c11c-407c-a12f-dbada333bbee\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.857809 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6q85\" (UniqueName: \"kubernetes.io/projected/3e51989b-c11c-407c-a12f-dbada333bbee-kube-api-access-c6q85\") pod \"3e51989b-c11c-407c-a12f-dbada333bbee\" (UID: \"3e51989b-c11c-407c-a12f-dbada333bbee\") " Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.860888 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e51989b-c11c-407c-a12f-dbada333bbee" (UID: "3e51989b-c11c-407c-a12f-dbada333bbee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.862582 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e51989b-c11c-407c-a12f-dbada333bbee" (UID: "3e51989b-c11c-407c-a12f-dbada333bbee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.870355 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-scripts" (OuterVolumeSpecName: "scripts") pod "3e51989b-c11c-407c-a12f-dbada333bbee" (UID: "3e51989b-c11c-407c-a12f-dbada333bbee"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.882803 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e51989b-c11c-407c-a12f-dbada333bbee-kube-api-access-c6q85" (OuterVolumeSpecName: "kube-api-access-c6q85") pod "3e51989b-c11c-407c-a12f-dbada333bbee" (UID: "3e51989b-c11c-407c-a12f-dbada333bbee"). InnerVolumeSpecName "kube-api-access-c6q85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.889011 4626 scope.go:117] "RemoveContainer" containerID="65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.898202 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e51989b-c11c-407c-a12f-dbada333bbee" (UID: "3e51989b-c11c-407c-a12f-dbada333bbee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.945725 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e51989b-c11c-407c-a12f-dbada333bbee" (UID: "3e51989b-c11c-407c-a12f-dbada333bbee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.957925 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-config-data" (OuterVolumeSpecName: "config-data") pod "3e51989b-c11c-407c-a12f-dbada333bbee" (UID: "3e51989b-c11c-407c-a12f-dbada333bbee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.961047 4626 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.961078 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.961095 4626 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.961106 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6q85\" (UniqueName: \"kubernetes.io/projected/3e51989b-c11c-407c-a12f-dbada333bbee-kube-api-access-c6q85\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.961117 4626 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e51989b-c11c-407c-a12f-dbada333bbee-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.961127 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.961136 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e51989b-c11c-407c-a12f-dbada333bbee-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.966742 4626 scope.go:117] 
"RemoveContainer" containerID="4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4" Feb 23 07:01:48 crc kubenswrapper[4626]: I0223 07:01:48.988754 4626 scope.go:117] "RemoveContainer" containerID="a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.020171 4626 scope.go:117] "RemoveContainer" containerID="31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b" Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.024268 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b\": container with ID starting with 31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b not found: ID does not exist" containerID="31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.024301 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b"} err="failed to get container status \"31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b\": rpc error: code = NotFound desc = could not find container \"31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b\": container with ID starting with 31eb3f1f1a53f0fa74c6aed28fae3ff74820f375f78009e491074df5c86a9b5b not found: ID does not exist" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.024324 4626 scope.go:117] "RemoveContainer" containerID="65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416" Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.025006 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416\": container with ID starting with 
65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416 not found: ID does not exist" containerID="65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.025040 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416"} err="failed to get container status \"65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416\": rpc error: code = NotFound desc = could not find container \"65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416\": container with ID starting with 65bba1f0394b206ceb5f7b1d52a0583b6019ff2f42265a2dab6bea0be4943416 not found: ID does not exist" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.025064 4626 scope.go:117] "RemoveContainer" containerID="4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4" Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.025422 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4\": container with ID starting with 4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4 not found: ID does not exist" containerID="4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.025442 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4"} err="failed to get container status \"4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4\": rpc error: code = NotFound desc = could not find container \"4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4\": container with ID starting with 4976c2a38e6992afd7521597e218d4237887b60b993769bcbd8bbd441d091cf4 not found: ID does not 
exist" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.025455 4626 scope.go:117] "RemoveContainer" containerID="a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b" Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.025819 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b\": container with ID starting with a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b not found: ID does not exist" containerID="a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.025845 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b"} err="failed to get container status \"a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b\": rpc error: code = NotFound desc = could not find container \"a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b\": container with ID starting with a406537645977b70185a78778ff0015e6080d792249016f262d498c056d76c4b not found: ID does not exist" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.190722 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.209515 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.231559 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.232258 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="ceilometer-notification-agent" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232283 4626 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="ceilometer-notification-agent" Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.232313 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="proxy-httpd" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232320 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="proxy-httpd" Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.232338 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="sg-core" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232346 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="sg-core" Feb 23 07:01:49 crc kubenswrapper[4626]: E0223 07:01:49.232356 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="ceilometer-central-agent" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232362 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="ceilometer-central-agent" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232717 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="ceilometer-notification-agent" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232737 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="sg-core" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232753 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="ceilometer-central-agent" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.232780 4626 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" containerName="proxy-httpd" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.235518 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.239032 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.239310 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.242969 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.369450 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.369530 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-run-httpd\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.369718 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-log-httpd\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.369820 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gtf\" (UniqueName: \"kubernetes.io/projected/f993751e-c4d5-468d-bfb3-1a0d854926c6-kube-api-access-m2gtf\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.369883 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.369951 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-config-data\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.369989 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-scripts\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.471144 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-run-httpd\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.471246 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-log-httpd\") pod 
\"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.471309 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gtf\" (UniqueName: \"kubernetes.io/projected/f993751e-c4d5-468d-bfb3-1a0d854926c6-kube-api-access-m2gtf\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.471380 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.471403 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-config-data\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.471559 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-scripts\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.471591 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.472943 4626 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-run-httpd\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.473269 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-log-httpd\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.478387 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.478877 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-config-data\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.479523 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.481466 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-scripts\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 
07:01:49.491318 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gtf\" (UniqueName: \"kubernetes.io/projected/f993751e-c4d5-468d-bfb3-1a0d854926c6-kube-api-access-m2gtf\") pod \"ceilometer-0\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.555035 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.806118 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.806411 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.815933 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.815968 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.866810 4626 generic.go:334] "Generic (PLEG): container finished" podID="2405cf95-00e4-40c0-bd99-266460b42580" containerID="378992167abee806d660f94752873bd86bfb062ec2479f143a977125d0c4c411" exitCode=0 Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.866901 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pbxnw" event={"ID":"2405cf95-00e4-40c0-bd99-266460b42580","Type":"ContainerDied","Data":"378992167abee806d660f94752873bd86bfb062ec2479f143a977125d0c4c411"} Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.868567 4626 generic.go:334] "Generic (PLEG): container finished" podID="d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" containerID="40d143df27f38c86086e8f336b531fdac2ea0921b36da45ef4f48a1637d3e67d" exitCode=0 Feb 23 07:01:49 crc 
kubenswrapper[4626]: I0223 07:01:49.868606 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-682fl" event={"ID":"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35","Type":"ContainerDied","Data":"40d143df27f38c86086e8f336b531fdac2ea0921b36da45ef4f48a1637d3e67d"} Feb 23 07:01:49 crc kubenswrapper[4626]: I0223 07:01:49.996354 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e51989b-c11c-407c-a12f-dbada333bbee" path="/var/lib/kubelet/pods/3e51989b-c11c-407c-a12f-dbada333bbee/volumes" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.000483 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:01:50 crc kubenswrapper[4626]: W0223 07:01:50.007315 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf993751e_c4d5_468d_bfb3_1a0d854926c6.slice/crio-441ce226754fd597a3344853e8e89df6afde9dd05d5c567d09a57b36e4e22670 WatchSource:0}: Error finding container 441ce226754fd597a3344853e8e89df6afde9dd05d5c567d09a57b36e4e22670: Status 404 returned error can't find the container with id 441ce226754fd597a3344853e8e89df6afde9dd05d5c567d09a57b36e4e22670 Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.061932 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.061973 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.087883 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.090966 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.168648 4626 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.244250 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c56fd79f-hhg42"] Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.244557 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" podUID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerName="dnsmasq-dns" containerID="cri-o://071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9" gracePeriod=10 Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.811823 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.887341 4626 generic.go:334] "Generic (PLEG): container finished" podID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerID="071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9" exitCode=0 Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.887709 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" event={"ID":"5bc64d2f-e681-426e-a19f-1c0da0764ef7","Type":"ContainerDied","Data":"071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9"} Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.887757 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" event={"ID":"5bc64d2f-e681-426e-a19f-1c0da0764ef7","Type":"ContainerDied","Data":"1bda026ac1b2e52d0fdbd92106de0aa7f075469269df238c67e2dbe1f8de9226"} Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.887779 4626 scope.go:117] "RemoveContainer" containerID="071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.888016 4626 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-7c56fd79f-hhg42" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.889658 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.889937 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.899548 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerStarted","Data":"441ce226754fd597a3344853e8e89df6afde9dd05d5c567d09a57b36e4e22670"} Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.918366 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-nb\") pod \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.918519 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-config\") pod \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.918604 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njz4p\" (UniqueName: 
\"kubernetes.io/projected/5bc64d2f-e681-426e-a19f-1c0da0764ef7-kube-api-access-njz4p\") pod \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.918661 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-swift-storage-0\") pod \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.918794 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-svc\") pod \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.918852 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-sb\") pod \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\" (UID: \"5bc64d2f-e681-426e-a19f-1c0da0764ef7\") " Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.938765 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc64d2f-e681-426e-a19f-1c0da0764ef7-kube-api-access-njz4p" (OuterVolumeSpecName: "kube-api-access-njz4p") pod "5bc64d2f-e681-426e-a19f-1c0da0764ef7" (UID: "5bc64d2f-e681-426e-a19f-1c0da0764ef7"). InnerVolumeSpecName "kube-api-access-njz4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:50 crc kubenswrapper[4626]: I0223 07:01:50.938967 4626 scope.go:117] "RemoveContainer" containerID="9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.028369 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njz4p\" (UniqueName: \"kubernetes.io/projected/5bc64d2f-e681-426e-a19f-1c0da0764ef7-kube-api-access-njz4p\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.062635 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-config" (OuterVolumeSpecName: "config") pod "5bc64d2f-e681-426e-a19f-1c0da0764ef7" (UID: "5bc64d2f-e681-426e-a19f-1c0da0764ef7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.076128 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bc64d2f-e681-426e-a19f-1c0da0764ef7" (UID: "5bc64d2f-e681-426e-a19f-1c0da0764ef7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.081968 4626 scope.go:117] "RemoveContainer" containerID="071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9" Feb 23 07:01:51 crc kubenswrapper[4626]: E0223 07:01:51.083112 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9\": container with ID starting with 071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9 not found: ID does not exist" containerID="071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.083151 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9"} err="failed to get container status \"071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9\": rpc error: code = NotFound desc = could not find container \"071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9\": container with ID starting with 071009f407c99132491e0e7138a1cb8d4ebf5842f2408df526d8d230530733c9 not found: ID does not exist" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.083180 4626 scope.go:117] "RemoveContainer" containerID="9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c" Feb 23 07:01:51 crc kubenswrapper[4626]: E0223 07:01:51.083567 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c\": container with ID starting with 9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c not found: ID does not exist" containerID="9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.083616 
4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c"} err="failed to get container status \"9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c\": rpc error: code = NotFound desc = could not find container \"9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c\": container with ID starting with 9426f5ffb92679a7cb46e579f4e76adb987634318e423edc3f3e9c3af8c4142c not found: ID does not exist" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.102308 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.112166 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bc64d2f-e681-426e-a19f-1c0da0764ef7" (UID: "5bc64d2f-e681-426e-a19f-1c0da0764ef7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.134811 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.134850 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.134863 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.196153 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5bc64d2f-e681-426e-a19f-1c0da0764ef7" (UID: "5bc64d2f-e681-426e-a19f-1c0da0764ef7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.203115 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bc64d2f-e681-426e-a19f-1c0da0764ef7" (UID: "5bc64d2f-e681-426e-a19f-1c0da0764ef7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.238418 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.238553 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bc64d2f-e681-426e-a19f-1c0da0764ef7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.518626 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-682fl" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.522151 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pbxnw" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.539591 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c56fd79f-hhg42"] Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.549324 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-config-data\") pod \"2405cf95-00e4-40c0-bd99-266460b42580\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.549468 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-config-data\") pod \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.549604 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-combined-ca-bundle\") pod \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.549691 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nxnj\" (UniqueName: \"kubernetes.io/projected/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-kube-api-access-5nxnj\") pod \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.549762 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-scripts\") pod \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\" (UID: \"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35\") " Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.549825 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-scripts\") pod \"2405cf95-00e4-40c0-bd99-266460b42580\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.549901 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-combined-ca-bundle\") pod \"2405cf95-00e4-40c0-bd99-266460b42580\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.550087 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npcf8\" (UniqueName: \"kubernetes.io/projected/2405cf95-00e4-40c0-bd99-266460b42580-kube-api-access-npcf8\") pod \"2405cf95-00e4-40c0-bd99-266460b42580\" (UID: \"2405cf95-00e4-40c0-bd99-266460b42580\") " Feb 23 07:01:51 
crc kubenswrapper[4626]: I0223 07:01:51.551250 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c56fd79f-hhg42"] Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.562108 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2405cf95-00e4-40c0-bd99-266460b42580-kube-api-access-npcf8" (OuterVolumeSpecName: "kube-api-access-npcf8") pod "2405cf95-00e4-40c0-bd99-266460b42580" (UID: "2405cf95-00e4-40c0-bd99-266460b42580"). InnerVolumeSpecName "kube-api-access-npcf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.566743 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-scripts" (OuterVolumeSpecName: "scripts") pod "2405cf95-00e4-40c0-bd99-266460b42580" (UID: "2405cf95-00e4-40c0-bd99-266460b42580"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.574904 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-kube-api-access-5nxnj" (OuterVolumeSpecName: "kube-api-access-5nxnj") pod "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" (UID: "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35"). InnerVolumeSpecName "kube-api-access-5nxnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.587700 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-scripts" (OuterVolumeSpecName: "scripts") pod "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" (UID: "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.605896 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" (UID: "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.607287 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-config-data" (OuterVolumeSpecName: "config-data") pod "2405cf95-00e4-40c0-bd99-266460b42580" (UID: "2405cf95-00e4-40c0-bd99-266460b42580"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.611784 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-config-data" (OuterVolumeSpecName: "config-data") pod "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" (UID: "d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.657621 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npcf8\" (UniqueName: \"kubernetes.io/projected/2405cf95-00e4-40c0-bd99-266460b42580-kube-api-access-npcf8\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.658428 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.658534 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.658602 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.658686 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nxnj\" (UniqueName: \"kubernetes.io/projected/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-kube-api-access-5nxnj\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.658746 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.658796 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.668959 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2405cf95-00e4-40c0-bd99-266460b42580" (UID: "2405cf95-00e4-40c0-bd99-266460b42580"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.761484 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2405cf95-00e4-40c0-bd99-266460b42580-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.913130 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pbxnw" event={"ID":"2405cf95-00e4-40c0-bd99-266460b42580","Type":"ContainerDied","Data":"73bd6ae43992adc3f7d1a921d0de546788fbc24b5f681dcfc9a45459060d294f"}
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.913189 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73bd6ae43992adc3f7d1a921d0de546788fbc24b5f681dcfc9a45459060d294f"
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.913283 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pbxnw"
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.920792 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-682fl"
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.920777 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-682fl" event={"ID":"d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35","Type":"ContainerDied","Data":"08683409904bef95935dea04544df2839e75a36c90c8ad43a53b9afdbb7e1de1"}
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.920910 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08683409904bef95935dea04544df2839e75a36c90c8ad43a53b9afdbb7e1de1"
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.935937 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerStarted","Data":"66094bc2e761357522e543c937f935380e8ad08b5933dcac9f9e8291a512e409"}
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.935995 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerStarted","Data":"2e4d032ef1dea3265aaa43d4f69072e51d419c797a7e20cd8a9ce51ef2cea18e"}
Feb 23 07:01:51 crc kubenswrapper[4626]: I0223 07:01:51.991652 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" path="/var/lib/kubelet/pods/5bc64d2f-e681-426e-a19f-1c0da0764ef7/volumes"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.063440 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 07:01:52 crc kubenswrapper[4626]: E0223 07:01:52.064396 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerName="dnsmasq-dns"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.064532 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerName="dnsmasq-dns"
Feb 23 07:01:52 crc kubenswrapper[4626]: E0223 07:01:52.064614 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" containerName="nova-cell1-conductor-db-sync"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.064676 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" containerName="nova-cell1-conductor-db-sync"
Feb 23 07:01:52 crc kubenswrapper[4626]: E0223 07:01:52.064747 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2405cf95-00e4-40c0-bd99-266460b42580" containerName="nova-manage"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.064812 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2405cf95-00e4-40c0-bd99-266460b42580" containerName="nova-manage"
Feb 23 07:01:52 crc kubenswrapper[4626]: E0223 07:01:52.064871 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerName="init"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.064939 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerName="init"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.065299 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" containerName="nova-cell1-conductor-db-sync"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.065379 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2405cf95-00e4-40c0-bd99-266460b42580" containerName="nova-manage"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.065438 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc64d2f-e681-426e-a19f-1c0da0764ef7" containerName="dnsmasq-dns"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.066518 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.073356 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.084537 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.123551 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.123817 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-log" containerID="cri-o://974d19150e8e9faf830024e02086ef38bb2916626861bc70bb97ce8ffc4223c8" gracePeriod=30
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.124276 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-api" containerID="cri-o://9851aebfa9b681137ce6eef38ab0ce44c7e3e32cdd89e62e9ef9267310d20216" gracePeriod=30
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.170325 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8470cfd-1a2b-4e2b-b59a-10f5de602156-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.170373 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8470cfd-1a2b-4e2b-b59a-10f5de602156-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.170521 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58wml\" (UniqueName: \"kubernetes.io/projected/a8470cfd-1a2b-4e2b-b59a-10f5de602156-kube-api-access-58wml\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.273398 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8470cfd-1a2b-4e2b-b59a-10f5de602156-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.273443 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8470cfd-1a2b-4e2b-b59a-10f5de602156-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.273551 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58wml\" (UniqueName: \"kubernetes.io/projected/a8470cfd-1a2b-4e2b-b59a-10f5de602156-kube-api-access-58wml\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.279027 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8470cfd-1a2b-4e2b-b59a-10f5de602156-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.280048 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8470cfd-1a2b-4e2b-b59a-10f5de602156-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.298175 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58wml\" (UniqueName: \"kubernetes.io/projected/a8470cfd-1a2b-4e2b-b59a-10f5de602156-kube-api-access-58wml\") pod \"nova-cell1-conductor-0\" (UID: \"a8470cfd-1a2b-4e2b-b59a-10f5de602156\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.388746 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.554428 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.949724 4626 generic.go:334] "Generic (PLEG): container finished" podID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerID="974d19150e8e9faf830024e02086ef38bb2916626861bc70bb97ce8ffc4223c8" exitCode=143
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.949806 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66e0eb4d-4467-485f-9203-e4674b950a2b","Type":"ContainerDied","Data":"974d19150e8e9faf830024e02086ef38bb2916626861bc70bb97ce8ffc4223c8"}
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.953523 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c2360f12-c181-43dc-81b0-cd6ed0ca05f2" containerName="nova-scheduler-scheduler" containerID="cri-o://3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67" gracePeriod=30
Feb 23 07:01:52 crc kubenswrapper[4626]: I0223 07:01:52.953702 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerStarted","Data":"0decb5826f71a9a5d6bffa2a751dfdbc46626783802f763c4a86beab9136c829"}
Feb 23 07:01:53 crc kubenswrapper[4626]: I0223 07:01:53.008325 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 07:01:53 crc kubenswrapper[4626]: I0223 07:01:53.962928 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a8470cfd-1a2b-4e2b-b59a-10f5de602156","Type":"ContainerStarted","Data":"ed383653610c14890f88f58249b4f9e6bfdd9ec1aac5edf558c8aa1024e4f15e"}
Feb 23 07:01:53 crc kubenswrapper[4626]: I0223 07:01:53.963374 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a8470cfd-1a2b-4e2b-b59a-10f5de602156","Type":"ContainerStarted","Data":"75cb581124e5597943721e81648ee151ca1cc4ea921b7d5f545b4ac005dc6286"}
Feb 23 07:01:53 crc kubenswrapper[4626]: I0223 07:01:53.963394 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 23 07:01:53 crc kubenswrapper[4626]: I0223 07:01:53.985596 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.985563327 podStartE2EDuration="1.985563327s" podCreationTimestamp="2026-02-23 07:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:53.976607982 +0000 UTC m=+1266.315937248" watchObservedRunningTime="2026-02-23 07:01:53.985563327 +0000 UTC m=+1266.324892593"
Feb 23 07:01:54 crc kubenswrapper[4626]: I0223 07:01:54.981307 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerStarted","Data":"dd82fa83fa96957fb43e955f2d371bd0c2b8bb41de177a21cafdc79ee2f899ee"}
Feb 23 07:01:54 crc kubenswrapper[4626]: I0223 07:01:54.981597 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 23 07:01:55 crc kubenswrapper[4626]: E0223 07:01:55.063707 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:01:55 crc kubenswrapper[4626]: E0223 07:01:55.064863 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:01:55 crc kubenswrapper[4626]: E0223 07:01:55.066204 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:01:55 crc kubenswrapper[4626]: E0223 07:01:55.066235 4626 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c2360f12-c181-43dc-81b0-cd6ed0ca05f2" containerName="nova-scheduler-scheduler"
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.040234 4626 generic.go:334] "Generic (PLEG): container finished" podID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerID="9851aebfa9b681137ce6eef38ab0ce44c7e3e32cdd89e62e9ef9267310d20216" exitCode=0
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.044031 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66e0eb4d-4467-485f-9203-e4674b950a2b","Type":"ContainerDied","Data":"9851aebfa9b681137ce6eef38ab0ce44c7e3e32cdd89e62e9ef9267310d20216"}
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.056933 4626 generic.go:334] "Generic (PLEG): container finished" podID="c2360f12-c181-43dc-81b0-cd6ed0ca05f2" containerID="3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67" exitCode=0
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.059221 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2360f12-c181-43dc-81b0-cd6ed0ca05f2","Type":"ContainerDied","Data":"3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67"}
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.330648 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.337902 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.354392 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.50052234 podStartE2EDuration="8.35437456s" podCreationTimestamp="2026-02-23 07:01:49 +0000 UTC" firstStartedPulling="2026-02-23 07:01:50.009986533 +0000 UTC m=+1262.349315799" lastFinishedPulling="2026-02-23 07:01:53.863838752 +0000 UTC m=+1266.203168019" observedRunningTime="2026-02-23 07:01:55.005180269 +0000 UTC m=+1267.344509535" watchObservedRunningTime="2026-02-23 07:01:57.35437456 +0000 UTC m=+1269.693703826"
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.415045 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlrvp\" (UniqueName: \"kubernetes.io/projected/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-kube-api-access-tlrvp\") pod \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") "
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.415103 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66e0eb4d-4467-485f-9203-e4674b950a2b-logs\") pod \"66e0eb4d-4467-485f-9203-e4674b950a2b\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") "
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.415234 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-combined-ca-bundle\") pod \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") "
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.415401 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-config-data\") pod \"66e0eb4d-4467-485f-9203-e4674b950a2b\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") "
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.415422 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-combined-ca-bundle\") pod \"66e0eb4d-4467-485f-9203-e4674b950a2b\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") "
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.415529 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4572\" (UniqueName: \"kubernetes.io/projected/66e0eb4d-4467-485f-9203-e4674b950a2b-kube-api-access-r4572\") pod \"66e0eb4d-4467-485f-9203-e4674b950a2b\" (UID: \"66e0eb4d-4467-485f-9203-e4674b950a2b\") "
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.415582 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-config-data\") pod \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\" (UID: \"c2360f12-c181-43dc-81b0-cd6ed0ca05f2\") "
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.417782 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e0eb4d-4467-485f-9203-e4674b950a2b-logs" (OuterVolumeSpecName: "logs") pod "66e0eb4d-4467-485f-9203-e4674b950a2b" (UID: "66e0eb4d-4467-485f-9203-e4674b950a2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.424985 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e0eb4d-4467-485f-9203-e4674b950a2b-kube-api-access-r4572" (OuterVolumeSpecName: "kube-api-access-r4572") pod "66e0eb4d-4467-485f-9203-e4674b950a2b" (UID: "66e0eb4d-4467-485f-9203-e4674b950a2b"). InnerVolumeSpecName "kube-api-access-r4572". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.425460 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-kube-api-access-tlrvp" (OuterVolumeSpecName: "kube-api-access-tlrvp") pod "c2360f12-c181-43dc-81b0-cd6ed0ca05f2" (UID: "c2360f12-c181-43dc-81b0-cd6ed0ca05f2"). InnerVolumeSpecName "kube-api-access-tlrvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.452956 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-config-data" (OuterVolumeSpecName: "config-data") pod "c2360f12-c181-43dc-81b0-cd6ed0ca05f2" (UID: "c2360f12-c181-43dc-81b0-cd6ed0ca05f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.457725 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66e0eb4d-4467-485f-9203-e4674b950a2b" (UID: "66e0eb4d-4467-485f-9203-e4674b950a2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.457943 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-config-data" (OuterVolumeSpecName: "config-data") pod "66e0eb4d-4467-485f-9203-e4674b950a2b" (UID: "66e0eb4d-4467-485f-9203-e4674b950a2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.464189 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2360f12-c181-43dc-81b0-cd6ed0ca05f2" (UID: "c2360f12-c181-43dc-81b0-cd6ed0ca05f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.518586 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4572\" (UniqueName: \"kubernetes.io/projected/66e0eb4d-4467-485f-9203-e4674b950a2b-kube-api-access-r4572\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.518621 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.518636 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlrvp\" (UniqueName: \"kubernetes.io/projected/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-kube-api-access-tlrvp\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.518647 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66e0eb4d-4467-485f-9203-e4674b950a2b-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.518657 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2360f12-c181-43dc-81b0-cd6ed0ca05f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.518666 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:57 crc kubenswrapper[4626]: I0223 07:01:57.518675 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66e0eb4d-4467-485f-9203-e4674b950a2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.077664 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"66e0eb4d-4467-485f-9203-e4674b950a2b","Type":"ContainerDied","Data":"5abf35ccf542474def5946becb75e89575df145775859a583353b4061ada79ad"}
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.077718 4626 scope.go:117] "RemoveContainer" containerID="9851aebfa9b681137ce6eef38ab0ce44c7e3e32cdd89e62e9ef9267310d20216"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.077848 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.092888 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2360f12-c181-43dc-81b0-cd6ed0ca05f2","Type":"ContainerDied","Data":"19e58eb4f3bb1b0c707daa908f5c515b2ae92ee6e9dd5ea25d6ee570646bdeb4"}
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.092979 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.137665 4626 scope.go:117] "RemoveContainer" containerID="974d19150e8e9faf830024e02086ef38bb2916626861bc70bb97ce8ffc4223c8"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.142549 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.159578 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.175410 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.185652 4626 scope.go:117] "RemoveContainer" containerID="3a30044cd46511b52a1809a83f4ef4659deeb9400470d6c1966b03adf363cc67"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.188531 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.198566 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: E0223 07:01:58.199059 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2360f12-c181-43dc-81b0-cd6ed0ca05f2" containerName="nova-scheduler-scheduler"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.199077 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2360f12-c181-43dc-81b0-cd6ed0ca05f2" containerName="nova-scheduler-scheduler"
Feb 23 07:01:58 crc kubenswrapper[4626]: E0223 07:01:58.199095 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-log"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.199101 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-log"
Feb 23 07:01:58 crc kubenswrapper[4626]: E0223 07:01:58.199135 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-api"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.199141 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-api"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.199331 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-log"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.199354 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2360f12-c181-43dc-81b0-cd6ed0ca05f2" containerName="nova-scheduler-scheduler"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.199367 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" containerName="nova-api-api"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.200064 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.204937 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.206565 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.215551 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.219123 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.220463 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.221190 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.338571 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.338663 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52ad1719-e334-43ac-ab3e-e36be1e23fb3-logs\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.338702 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxf7\" (UniqueName: \"kubernetes.io/projected/f460fe0c-1d4d-4457-8800-6cc7d9a12279-kube-api-access-9cxf7\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.338759 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-config-data\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.338785 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfhx4\" (UniqueName: \"kubernetes.io/projected/52ad1719-e334-43ac-ab3e-e36be1e23fb3-kube-api-access-jfhx4\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.338834 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-config-data\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.338871 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.441186 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.441337 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52ad1719-e334-43ac-ab3e-e36be1e23fb3-logs\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.441400 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxf7\" (UniqueName: \"kubernetes.io/projected/f460fe0c-1d4d-4457-8800-6cc7d9a12279-kube-api-access-9cxf7\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.441486 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-config-data\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.441533 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfhx4\" (UniqueName: \"kubernetes.io/projected/52ad1719-e334-43ac-ab3e-e36be1e23fb3-kube-api-access-jfhx4\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.441571 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-config-data\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.441646 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.442194 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52ad1719-e334-43ac-ab3e-e36be1e23fb3-logs\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.450956 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-config-data\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.451660 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.456405 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.459055 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-config-data\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.459998 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxf7\" (UniqueName: \"kubernetes.io/projected/f460fe0c-1d4d-4457-8800-6cc7d9a12279-kube-api-access-9cxf7\") pod \"nova-scheduler-0\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.462714 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfhx4\" (UniqueName: \"kubernetes.io/projected/52ad1719-e334-43ac-ab3e-e36be1e23fb3-kube-api-access-jfhx4\") pod \"nova-api-0\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " pod="openstack/nova-api-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.578638 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:01:58 crc kubenswrapper[4626]: I0223 07:01:58.612841 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:01:59 crc kubenswrapper[4626]: I0223 07:01:59.158058 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:01:59 crc kubenswrapper[4626]: I0223 07:01:59.271679 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:01:59 crc kubenswrapper[4626]: I0223 07:01:59.994893 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e0eb4d-4467-485f-9203-e4674b950a2b" path="/var/lib/kubelet/pods/66e0eb4d-4467-485f-9203-e4674b950a2b/volumes"
Feb 23 07:01:59 crc kubenswrapper[4626]: I0223 07:01:59.996126 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2360f12-c181-43dc-81b0-cd6ed0ca05f2" path="/var/lib/kubelet/pods/c2360f12-c181-43dc-81b0-cd6ed0ca05f2/volumes"
Feb 23 07:02:00 crc kubenswrapper[4626]: I0223 07:02:00.121739 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f460fe0c-1d4d-4457-8800-6cc7d9a12279","Type":"ContainerStarted","Data":"dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62"}
Feb 23 07:02:00 crc kubenswrapper[4626]: I0223 07:02:00.121793 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0"
event={"ID":"f460fe0c-1d4d-4457-8800-6cc7d9a12279","Type":"ContainerStarted","Data":"69fc982d0be94e0f8cf023bc5fda2ebcb54ddb93cac56f86a3de01ac3c1d4bca"} Feb 23 07:02:00 crc kubenswrapper[4626]: I0223 07:02:00.128748 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52ad1719-e334-43ac-ab3e-e36be1e23fb3","Type":"ContainerStarted","Data":"37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba"} Feb 23 07:02:00 crc kubenswrapper[4626]: I0223 07:02:00.128905 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52ad1719-e334-43ac-ab3e-e36be1e23fb3","Type":"ContainerStarted","Data":"0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457"} Feb 23 07:02:00 crc kubenswrapper[4626]: I0223 07:02:00.129005 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52ad1719-e334-43ac-ab3e-e36be1e23fb3","Type":"ContainerStarted","Data":"8b1e8ea8a54ebed97158ba7b2675a3ff9c8eda0e8692e2c4288c7ef9c774bfc8"} Feb 23 07:02:00 crc kubenswrapper[4626]: I0223 07:02:00.153080 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.153062316 podStartE2EDuration="2.153062316s" podCreationTimestamp="2026-02-23 07:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:00.14591996 +0000 UTC m=+1272.485249226" watchObservedRunningTime="2026-02-23 07:02:00.153062316 +0000 UTC m=+1272.492391583" Feb 23 07:02:00 crc kubenswrapper[4626]: I0223 07:02:00.171089 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.17107181 podStartE2EDuration="2.17107181s" podCreationTimestamp="2026-02-23 07:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 07:02:00.163712305 +0000 UTC m=+1272.503041572" watchObservedRunningTime="2026-02-23 07:02:00.17107181 +0000 UTC m=+1272.510401077" Feb 23 07:02:02 crc kubenswrapper[4626]: I0223 07:02:02.627295 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 23 07:02:03 crc kubenswrapper[4626]: I0223 07:02:03.579209 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 07:02:08 crc kubenswrapper[4626]: I0223 07:02:08.579235 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 07:02:08 crc kubenswrapper[4626]: I0223 07:02:08.606880 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:02:08 crc kubenswrapper[4626]: I0223 07:02:08.613833 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:02:08 crc kubenswrapper[4626]: I0223 07:02:08.613868 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:02:09 crc kubenswrapper[4626]: I0223 07:02:09.242191 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 07:02:09 crc kubenswrapper[4626]: I0223 07:02:09.696657 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:02:09 crc kubenswrapper[4626]: I0223 07:02:09.696826 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:02:15 crc kubenswrapper[4626]: E0223 07:02:15.986215 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc739604_b750_4eb0_ada8_55c64c50badd.slice/crio-611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc739604_b750_4eb0_ada8_55c64c50badd.slice/crio-conmon-611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd778a9bb_ee5d_4974_909f_3a16b3db565b.slice/crio-e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.237965 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.244598 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.250652 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-combined-ca-bundle\") pod \"bc739604-b750-4eb0-ada8-55c64c50badd\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.250716 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-config-data\") pod \"d778a9bb-ee5d-4974-909f-3a16b3db565b\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.250840 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-combined-ca-bundle\") pod \"d778a9bb-ee5d-4974-909f-3a16b3db565b\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.250999 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc739604-b750-4eb0-ada8-55c64c50badd-logs\") pod \"bc739604-b750-4eb0-ada8-55c64c50badd\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.251143 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-config-data\") pod \"bc739604-b750-4eb0-ada8-55c64c50badd\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.251237 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xlt\" (UniqueName: 
\"kubernetes.io/projected/bc739604-b750-4eb0-ada8-55c64c50badd-kube-api-access-r5xlt\") pod \"bc739604-b750-4eb0-ada8-55c64c50badd\" (UID: \"bc739604-b750-4eb0-ada8-55c64c50badd\") " Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.251325 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzrl4\" (UniqueName: \"kubernetes.io/projected/d778a9bb-ee5d-4974-909f-3a16b3db565b-kube-api-access-xzrl4\") pod \"d778a9bb-ee5d-4974-909f-3a16b3db565b\" (UID: \"d778a9bb-ee5d-4974-909f-3a16b3db565b\") " Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.251669 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc739604-b750-4eb0-ada8-55c64c50badd-logs" (OuterVolumeSpecName: "logs") pod "bc739604-b750-4eb0-ada8-55c64c50badd" (UID: "bc739604-b750-4eb0-ada8-55c64c50badd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.252412 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc739604-b750-4eb0-ada8-55c64c50badd-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.257818 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc739604-b750-4eb0-ada8-55c64c50badd-kube-api-access-r5xlt" (OuterVolumeSpecName: "kube-api-access-r5xlt") pod "bc739604-b750-4eb0-ada8-55c64c50badd" (UID: "bc739604-b750-4eb0-ada8-55c64c50badd"). InnerVolumeSpecName "kube-api-access-r5xlt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.265853 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d778a9bb-ee5d-4974-909f-3a16b3db565b-kube-api-access-xzrl4" (OuterVolumeSpecName: "kube-api-access-xzrl4") pod "d778a9bb-ee5d-4974-909f-3a16b3db565b" (UID: "d778a9bb-ee5d-4974-909f-3a16b3db565b"). InnerVolumeSpecName "kube-api-access-xzrl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.279740 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-config-data" (OuterVolumeSpecName: "config-data") pod "d778a9bb-ee5d-4974-909f-3a16b3db565b" (UID: "d778a9bb-ee5d-4974-909f-3a16b3db565b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.294140 4626 generic.go:334] "Generic (PLEG): container finished" podID="d778a9bb-ee5d-4974-909f-3a16b3db565b" containerID="e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc" exitCode=137 Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.294196 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d778a9bb-ee5d-4974-909f-3a16b3db565b","Type":"ContainerDied","Data":"e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc"} Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.294225 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d778a9bb-ee5d-4974-909f-3a16b3db565b","Type":"ContainerDied","Data":"c1e1d243c0cd34c4fab6bafe89a8c5dbcb8599c3c04c38b0301f0928ab0b7e0e"} Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.294242 4626 scope.go:117] "RemoveContainer" containerID="e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc" 
Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.294348 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.297557 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-config-data" (OuterVolumeSpecName: "config-data") pod "bc739604-b750-4eb0-ada8-55c64c50badd" (UID: "bc739604-b750-4eb0-ada8-55c64c50badd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.298327 4626 generic.go:334] "Generic (PLEG): container finished" podID="bc739604-b750-4eb0-ada8-55c64c50badd" containerID="611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f" exitCode=137 Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.298351 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc739604-b750-4eb0-ada8-55c64c50badd","Type":"ContainerDied","Data":"611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f"} Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.298367 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bc739604-b750-4eb0-ada8-55c64c50badd","Type":"ContainerDied","Data":"3e4401e0903637404b11ce6090be1cc7f9518fde6f358547a47c1de0d45298e9"} Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.298423 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.299167 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d778a9bb-ee5d-4974-909f-3a16b3db565b" (UID: "d778a9bb-ee5d-4974-909f-3a16b3db565b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.313786 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc739604-b750-4eb0-ada8-55c64c50badd" (UID: "bc739604-b750-4eb0-ada8-55c64c50badd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.321388 4626 scope.go:117] "RemoveContainer" containerID="e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc" Feb 23 07:02:16 crc kubenswrapper[4626]: E0223 07:02:16.321783 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc\": container with ID starting with e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc not found: ID does not exist" containerID="e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.321857 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc"} err="failed to get container status \"e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc\": rpc error: code = NotFound desc = could not find container \"e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc\": container with ID starting with e875125475094d904a8fd2b78fd7ec288f5de76626d89373b35233591c6475bc not found: ID does not exist" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.321879 4626 scope.go:117] "RemoveContainer" containerID="611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f" Feb 23 07:02:16 crc kubenswrapper[4626]: 
I0223 07:02:16.340796 4626 scope.go:117] "RemoveContainer" containerID="99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.354945 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.355023 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.355037 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5xlt\" (UniqueName: \"kubernetes.io/projected/bc739604-b750-4eb0-ada8-55c64c50badd-kube-api-access-r5xlt\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.355051 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzrl4\" (UniqueName: \"kubernetes.io/projected/d778a9bb-ee5d-4974-909f-3a16b3db565b-kube-api-access-xzrl4\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.355061 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc739604-b750-4eb0-ada8-55c64c50badd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.355069 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d778a9bb-ee5d-4974-909f-3a16b3db565b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.357668 4626 scope.go:117] "RemoveContainer" containerID="611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f" Feb 23 07:02:16 crc kubenswrapper[4626]: E0223 
07:02:16.358005 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f\": container with ID starting with 611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f not found: ID does not exist" containerID="611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.358043 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f"} err="failed to get container status \"611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f\": rpc error: code = NotFound desc = could not find container \"611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f\": container with ID starting with 611eecca4911947eed380c60db3a1c6e1e10bfa3863bf2faff64d212612a387f not found: ID does not exist" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.358067 4626 scope.go:117] "RemoveContainer" containerID="99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da" Feb 23 07:02:16 crc kubenswrapper[4626]: E0223 07:02:16.358358 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da\": container with ID starting with 99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da not found: ID does not exist" containerID="99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.358383 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da"} err="failed to get container status \"99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da\": rpc 
error: code = NotFound desc = could not find container \"99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da\": container with ID starting with 99181d17f20707d10a34ce892570778896412a0f8e6775abe2e290995141e9da not found: ID does not exist" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.641913 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.659211 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.692635 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.703285 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: E0223 07:02:16.703813 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d778a9bb-ee5d-4974-909f-3a16b3db565b" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.703834 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d778a9bb-ee5d-4974-909f-3a16b3db565b" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:02:16 crc kubenswrapper[4626]: E0223 07:02:16.703846 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-metadata" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.703852 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-metadata" Feb 23 07:02:16 crc kubenswrapper[4626]: E0223 07:02:16.703879 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-log" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.703885 4626 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-log" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.704109 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-metadata" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.704127 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" containerName="nova-metadata-log" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.704139 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d778a9bb-ee5d-4974-909f-3a16b3db565b" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.704869 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.709529 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.710467 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.711674 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.714277 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.722490 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.728575 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 
07:02:16.730275 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.731760 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.732012 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.736597 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.767469 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.767559 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.767605 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e80536-e515-42fe-946e-efccabf4a93e-logs\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.767627 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.767646 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-config-data\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.767668 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lx2l\" (UniqueName: \"kubernetes.io/projected/85e80536-e515-42fe-946e-efccabf4a93e-kube-api-access-9lx2l\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.767974 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.768044 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.768267 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqzt8\" (UniqueName: 
\"kubernetes.io/projected/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-kube-api-access-xqzt8\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.768451 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.870853 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.870901 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.870982 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqzt8\" (UniqueName: \"kubernetes.io/projected/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-kube-api-access-xqzt8\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871060 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871101 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871165 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871227 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e80536-e515-42fe-946e-efccabf4a93e-logs\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871254 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871272 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-config-data\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " 
pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871291 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lx2l\" (UniqueName: \"kubernetes.io/projected/85e80536-e515-42fe-946e-efccabf4a93e-kube-api-access-9lx2l\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.871913 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e80536-e515-42fe-946e-efccabf4a93e-logs\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.874942 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.876432 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.876691 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.878398 4626 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.879083 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-config-data\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.884276 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.885610 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.887700 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lx2l\" (UniqueName: \"kubernetes.io/projected/85e80536-e515-42fe-946e-efccabf4a93e-kube-api-access-9lx2l\") pod \"nova-metadata-0\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " pod="openstack/nova-metadata-0" Feb 23 07:02:16 crc kubenswrapper[4626]: I0223 07:02:16.888378 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqzt8\" (UniqueName: \"kubernetes.io/projected/bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9-kube-api-access-xqzt8\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:17 crc kubenswrapper[4626]: I0223 07:02:17.023067 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:17 crc kubenswrapper[4626]: I0223 07:02:17.046302 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:17 crc kubenswrapper[4626]: I0223 07:02:17.456186 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:02:17 crc kubenswrapper[4626]: W0223 07:02:17.458588 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbcfc4cd_8abe_43d1_88c9_7ebb07787fb9.slice/crio-9572475ed472e8c7c816011a160dfa045fc31238774e447fa548e47158a2104e WatchSource:0}: Error finding container 9572475ed472e8c7c816011a160dfa045fc31238774e447fa548e47158a2104e: Status 404 returned error can't find the container with id 9572475ed472e8c7c816011a160dfa045fc31238774e447fa548e47158a2104e Feb 23 07:02:17 crc kubenswrapper[4626]: I0223 07:02:17.528013 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:17 crc kubenswrapper[4626]: I0223 07:02:17.994819 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc739604-b750-4eb0-ada8-55c64c50badd" path="/var/lib/kubelet/pods/bc739604-b750-4eb0-ada8-55c64c50badd/volumes" Feb 23 07:02:17 crc kubenswrapper[4626]: I0223 07:02:17.996084 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d778a9bb-ee5d-4974-909f-3a16b3db565b" path="/var/lib/kubelet/pods/d778a9bb-ee5d-4974-909f-3a16b3db565b/volumes" Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.320451 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9","Type":"ContainerStarted","Data":"dbbd97ce7ceaf77dd116530658d0942d309e10fa76989d46c380fd2c5b9ee6a5"} Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.321722 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9","Type":"ContainerStarted","Data":"9572475ed472e8c7c816011a160dfa045fc31238774e447fa548e47158a2104e"} Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.323988 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"85e80536-e515-42fe-946e-efccabf4a93e","Type":"ContainerStarted","Data":"c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361"} Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.324042 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"85e80536-e515-42fe-946e-efccabf4a93e","Type":"ContainerStarted","Data":"a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204"} Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.324055 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"85e80536-e515-42fe-946e-efccabf4a93e","Type":"ContainerStarted","Data":"02da5d487f8f6feb587954c8171db90618d0108f49a2588197a95027af16ab70"} Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.358429 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.358407034 podStartE2EDuration="2.358407034s" podCreationTimestamp="2026-02-23 07:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:18.353365308 +0000 UTC m=+1290.692694574" watchObservedRunningTime="2026-02-23 07:02:18.358407034 +0000 UTC m=+1290.697736300" Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.374556 
4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.374534911 podStartE2EDuration="2.374534911s" podCreationTimestamp="2026-02-23 07:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:18.3725671 +0000 UTC m=+1290.711896366" watchObservedRunningTime="2026-02-23 07:02:18.374534911 +0000 UTC m=+1290.713864176" Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.618651 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.619392 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.620334 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:02:18 crc kubenswrapper[4626]: I0223 07:02:18.622266 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.335441 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.340133 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.549203 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-746ddbbc65-7nl6w"] Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.552668 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.573625 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746ddbbc65-7nl6w"] Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.581675 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.655300 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-swift-storage-0\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.655358 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-nb\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.655387 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-sb\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.655417 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-config\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " 
pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.655461 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thpz\" (UniqueName: \"kubernetes.io/projected/5b99373e-a436-4150-b209-0c68797de11e-kube-api-access-7thpz\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.655745 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-svc\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.757355 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-svc\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.758345 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-swift-storage-0\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.758398 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-nb\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " 
pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.758437 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-sb\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.758471 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-config\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.758554 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7thpz\" (UniqueName: \"kubernetes.io/projected/5b99373e-a436-4150-b209-0c68797de11e-kube-api-access-7thpz\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.758265 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-svc\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.759513 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-swift-storage-0\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc 
kubenswrapper[4626]: I0223 07:02:19.760074 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-nb\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.760615 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-sb\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.761131 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-config\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.778581 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thpz\" (UniqueName: \"kubernetes.io/projected/5b99373e-a436-4150-b209-0c68797de11e-kube-api-access-7thpz\") pod \"dnsmasq-dns-746ddbbc65-7nl6w\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:19 crc kubenswrapper[4626]: I0223 07:02:19.887722 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:20 crc kubenswrapper[4626]: I0223 07:02:20.408997 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746ddbbc65-7nl6w"] Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.357821 4626 generic.go:334] "Generic (PLEG): container finished" podID="5b99373e-a436-4150-b209-0c68797de11e" containerID="714c8abfefee636df50db9183876aaea31f1ac358cf90edb605723698af4399e" exitCode=0 Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.357893 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" event={"ID":"5b99373e-a436-4150-b209-0c68797de11e","Type":"ContainerDied","Data":"714c8abfefee636df50db9183876aaea31f1ac358cf90edb605723698af4399e"} Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.358206 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" event={"ID":"5b99373e-a436-4150-b209-0c68797de11e","Type":"ContainerStarted","Data":"3ad139070188a7b901a3f5bed581583e79924e5151957102fe2eac0bf122f471"} Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.667937 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.668448 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-central-agent" containerID="cri-o://2e4d032ef1dea3265aaa43d4f69072e51d419c797a7e20cd8a9ce51ef2cea18e" gracePeriod=30 Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.668542 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="proxy-httpd" containerID="cri-o://dd82fa83fa96957fb43e955f2d371bd0c2b8bb41de177a21cafdc79ee2f899ee" gracePeriod=30 Feb 23 07:02:21 crc 
kubenswrapper[4626]: I0223 07:02:21.668611 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="sg-core" containerID="cri-o://0decb5826f71a9a5d6bffa2a751dfdbc46626783802f763c4a86beab9136c829" gracePeriod=30 Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.668658 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-notification-agent" containerID="cri-o://66094bc2e761357522e543c937f935380e8ad08b5933dcac9f9e8291a512e409" gracePeriod=30 Feb 23 07:02:21 crc kubenswrapper[4626]: I0223 07:02:21.887054 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.023902 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.047248 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.047291 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.388570 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" event={"ID":"5b99373e-a436-4150-b209-0c68797de11e","Type":"ContainerStarted","Data":"75f7038c6583ef2394671fa64d9fe699d65c2b64bbcae274e4afd7764b492157"} Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.390092 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.398385 4626 generic.go:334] "Generic (PLEG): container finished" podID="f993751e-c4d5-468d-bfb3-1a0d854926c6" 
containerID="dd82fa83fa96957fb43e955f2d371bd0c2b8bb41de177a21cafdc79ee2f899ee" exitCode=0 Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.398412 4626 generic.go:334] "Generic (PLEG): container finished" podID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerID="0decb5826f71a9a5d6bffa2a751dfdbc46626783802f763c4a86beab9136c829" exitCode=2 Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.398419 4626 generic.go:334] "Generic (PLEG): container finished" podID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerID="2e4d032ef1dea3265aaa43d4f69072e51d419c797a7e20cd8a9ce51ef2cea18e" exitCode=0 Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.398643 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-log" containerID="cri-o://0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457" gracePeriod=30 Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.398916 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerDied","Data":"dd82fa83fa96957fb43e955f2d371bd0c2b8bb41de177a21cafdc79ee2f899ee"} Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.398945 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerDied","Data":"0decb5826f71a9a5d6bffa2a751dfdbc46626783802f763c4a86beab9136c829"} Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.398958 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerDied","Data":"2e4d032ef1dea3265aaa43d4f69072e51d419c797a7e20cd8a9ce51ef2cea18e"} Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.399037 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-api" containerID="cri-o://37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba" gracePeriod=30 Feb 23 07:02:22 crc kubenswrapper[4626]: I0223 07:02:22.424892 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" podStartSLOduration=3.424879527 podStartE2EDuration="3.424879527s" podCreationTimestamp="2026-02-23 07:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:22.419858681 +0000 UTC m=+1294.759187948" watchObservedRunningTime="2026-02-23 07:02:22.424879527 +0000 UTC m=+1294.764208794" Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.435352 4626 generic.go:334] "Generic (PLEG): container finished" podID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerID="66094bc2e761357522e543c937f935380e8ad08b5933dcac9f9e8291a512e409" exitCode=0 Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.435805 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerDied","Data":"66094bc2e761357522e543c937f935380e8ad08b5933dcac9f9e8291a512e409"} Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.438804 4626 generic.go:334] "Generic (PLEG): container finished" podID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerID="0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457" exitCode=143 Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.438912 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52ad1719-e334-43ac-ab3e-e36be1e23fb3","Type":"ContainerDied","Data":"0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457"} Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.703083 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.775392 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-combined-ca-bundle\") pod \"f993751e-c4d5-468d-bfb3-1a0d854926c6\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.775549 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-log-httpd\") pod \"f993751e-c4d5-468d-bfb3-1a0d854926c6\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.775567 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-run-httpd\") pod \"f993751e-c4d5-468d-bfb3-1a0d854926c6\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.775583 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-scripts\") pod \"f993751e-c4d5-468d-bfb3-1a0d854926c6\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.775634 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-config-data\") pod \"f993751e-c4d5-468d-bfb3-1a0d854926c6\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.775668 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-sg-core-conf-yaml\") pod \"f993751e-c4d5-468d-bfb3-1a0d854926c6\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.775710 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2gtf\" (UniqueName: \"kubernetes.io/projected/f993751e-c4d5-468d-bfb3-1a0d854926c6-kube-api-access-m2gtf\") pod \"f993751e-c4d5-468d-bfb3-1a0d854926c6\" (UID: \"f993751e-c4d5-468d-bfb3-1a0d854926c6\") " Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.777017 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f993751e-c4d5-468d-bfb3-1a0d854926c6" (UID: "f993751e-c4d5-468d-bfb3-1a0d854926c6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.777159 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f993751e-c4d5-468d-bfb3-1a0d854926c6" (UID: "f993751e-c4d5-468d-bfb3-1a0d854926c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.783737 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-scripts" (OuterVolumeSpecName: "scripts") pod "f993751e-c4d5-468d-bfb3-1a0d854926c6" (UID: "f993751e-c4d5-468d-bfb3-1a0d854926c6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:23 crc kubenswrapper[4626]: I0223 07:02:23.813346 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f993751e-c4d5-468d-bfb3-1a0d854926c6-kube-api-access-m2gtf" (OuterVolumeSpecName: "kube-api-access-m2gtf") pod "f993751e-c4d5-468d-bfb3-1a0d854926c6" (UID: "f993751e-c4d5-468d-bfb3-1a0d854926c6"). InnerVolumeSpecName "kube-api-access-m2gtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:23.886111 4626 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:23.886139 4626 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f993751e-c4d5-468d-bfb3-1a0d854926c6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:23.886149 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:23.886158 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2gtf\" (UniqueName: \"kubernetes.io/projected/f993751e-c4d5-468d-bfb3-1a0d854926c6-kube-api-access-m2gtf\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:23.919661 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f993751e-c4d5-468d-bfb3-1a0d854926c6" (UID: "f993751e-c4d5-468d-bfb3-1a0d854926c6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:23.994042 4626 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.027061 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-config-data" (OuterVolumeSpecName: "config-data") pod "f993751e-c4d5-468d-bfb3-1a0d854926c6" (UID: "f993751e-c4d5-468d-bfb3-1a0d854926c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.097585 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.107649 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f993751e-c4d5-468d-bfb3-1a0d854926c6" (UID: "f993751e-c4d5-468d-bfb3-1a0d854926c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.199289 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f993751e-c4d5-468d-bfb3-1a0d854926c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.449410 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.449405 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f993751e-c4d5-468d-bfb3-1a0d854926c6","Type":"ContainerDied","Data":"441ce226754fd597a3344853e8e89df6afde9dd05d5c567d09a57b36e4e22670"} Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.449827 4626 scope.go:117] "RemoveContainer" containerID="dd82fa83fa96957fb43e955f2d371bd0c2b8bb41de177a21cafdc79ee2f899ee" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.470239 4626 scope.go:117] "RemoveContainer" containerID="0decb5826f71a9a5d6bffa2a751dfdbc46626783802f763c4a86beab9136c829" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.488378 4626 scope.go:117] "RemoveContainer" containerID="66094bc2e761357522e543c937f935380e8ad08b5933dcac9f9e8291a512e409" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.497714 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.513359 4626 scope.go:117] "RemoveContainer" containerID="2e4d032ef1dea3265aaa43d4f69072e51d419c797a7e20cd8a9ce51ef2cea18e" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.516392 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.533584 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:02:24 crc kubenswrapper[4626]: E0223 07:02:24.534587 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="sg-core" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534610 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="sg-core" Feb 23 07:02:24 crc kubenswrapper[4626]: E0223 07:02:24.534632 4626 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="proxy-httpd" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534638 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="proxy-httpd" Feb 23 07:02:24 crc kubenswrapper[4626]: E0223 07:02:24.534653 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-central-agent" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534662 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-central-agent" Feb 23 07:02:24 crc kubenswrapper[4626]: E0223 07:02:24.534686 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-notification-agent" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534691 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-notification-agent" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534897 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="sg-core" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534922 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-notification-agent" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534935 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="ceilometer-central-agent" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.534944 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" containerName="proxy-httpd" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 
07:02:24.536626 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.539284 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.543542 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.544434 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.609719 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.609928 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-config-data\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.610010 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q9jk\" (UniqueName: \"kubernetes.io/projected/0ec5912f-17d6-485b-acf4-d5017b785561-kube-api-access-8q9jk\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.610039 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-scripts\") pod 
\"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.610177 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-run-httpd\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.610351 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.610693 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-log-httpd\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.713287 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-run-httpd\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.713624 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.713865 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-run-httpd\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.713875 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-log-httpd\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.714005 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.714227 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-config-data\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.714318 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q9jk\" (UniqueName: \"kubernetes.io/projected/0ec5912f-17d6-485b-acf4-d5017b785561-kube-api-access-8q9jk\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.714356 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-scripts\") pod \"ceilometer-0\" (UID: 
\"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.714866 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-log-httpd\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.721059 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.721964 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.729529 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-config-data\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.732810 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q9jk\" (UniqueName: \"kubernetes.io/projected/0ec5912f-17d6-485b-acf4-d5017b785561-kube-api-access-8q9jk\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.741161 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-scripts\") pod \"ceilometer-0\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") " pod="openstack/ceilometer-0" Feb 23 07:02:24 crc kubenswrapper[4626]: I0223 07:02:24.853084 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:02:25 crc kubenswrapper[4626]: I0223 07:02:25.388545 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:02:25 crc kubenswrapper[4626]: I0223 07:02:25.462622 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerStarted","Data":"f1d18c1f8e2e9650cc3ce3da275856550850f07df89f1fda159381b1db61346b"} Feb 23 07:02:25 crc kubenswrapper[4626]: I0223 07:02:25.960351 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:25 crc kubenswrapper[4626]: I0223 07:02:25.994626 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f993751e-c4d5-468d-bfb3-1a0d854926c6" path="/var/lib/kubelet/pods/f993751e-c4d5-468d-bfb3-1a0d854926c6/volumes" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.163709 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52ad1719-e334-43ac-ab3e-e36be1e23fb3-logs\") pod \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.164740 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfhx4\" (UniqueName: \"kubernetes.io/projected/52ad1719-e334-43ac-ab3e-e36be1e23fb3-kube-api-access-jfhx4\") pod \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.165327 4626 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-combined-ca-bundle\") pod \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.164649 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52ad1719-e334-43ac-ab3e-e36be1e23fb3-logs" (OuterVolumeSpecName: "logs") pod "52ad1719-e334-43ac-ab3e-e36be1e23fb3" (UID: "52ad1719-e334-43ac-ab3e-e36be1e23fb3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.167410 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-config-data\") pod \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\" (UID: \"52ad1719-e334-43ac-ab3e-e36be1e23fb3\") " Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.168174 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52ad1719-e334-43ac-ab3e-e36be1e23fb3-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.169032 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ad1719-e334-43ac-ab3e-e36be1e23fb3-kube-api-access-jfhx4" (OuterVolumeSpecName: "kube-api-access-jfhx4") pod "52ad1719-e334-43ac-ab3e-e36be1e23fb3" (UID: "52ad1719-e334-43ac-ab3e-e36be1e23fb3"). InnerVolumeSpecName "kube-api-access-jfhx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.195645 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-config-data" (OuterVolumeSpecName: "config-data") pod "52ad1719-e334-43ac-ab3e-e36be1e23fb3" (UID: "52ad1719-e334-43ac-ab3e-e36be1e23fb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.211684 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52ad1719-e334-43ac-ab3e-e36be1e23fb3" (UID: "52ad1719-e334-43ac-ab3e-e36be1e23fb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.273151 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.273207 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ad1719-e334-43ac-ab3e-e36be1e23fb3-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.273221 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfhx4\" (UniqueName: \"kubernetes.io/projected/52ad1719-e334-43ac-ab3e-e36be1e23fb3-kube-api-access-jfhx4\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.450074 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.450740 4626 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" containerName="kube-state-metrics" containerID="cri-o://91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d" gracePeriod=30 Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.481915 4626 generic.go:334] "Generic (PLEG): container finished" podID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerID="37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba" exitCode=0 Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.481995 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52ad1719-e334-43ac-ab3e-e36be1e23fb3","Type":"ContainerDied","Data":"37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba"} Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.482029 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52ad1719-e334-43ac-ab3e-e36be1e23fb3","Type":"ContainerDied","Data":"8b1e8ea8a54ebed97158ba7b2675a3ff9c8eda0e8692e2c4288c7ef9c774bfc8"} Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.482050 4626 scope.go:117] "RemoveContainer" containerID="37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.482208 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.492687 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerStarted","Data":"c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac"} Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.542228 4626 scope.go:117] "RemoveContainer" containerID="0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.557791 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.571005 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.611565 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:26 crc kubenswrapper[4626]: E0223 07:02:26.612266 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-log" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.612295 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-log" Feb 23 07:02:26 crc kubenswrapper[4626]: E0223 07:02:26.612306 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-api" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.612317 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-api" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.612675 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-log" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 
07:02:26.612706 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" containerName="nova-api-api" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.614215 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.614366 4626 scope.go:117] "RemoveContainer" containerID="37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.620209 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.620464 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.620655 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 23 07:02:26 crc kubenswrapper[4626]: E0223 07:02:26.626046 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba\": container with ID starting with 37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba not found: ID does not exist" containerID="37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.626087 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba"} err="failed to get container status \"37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba\": rpc error: code = NotFound desc = could not find container \"37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba\": container with ID starting with 
37f92faed7e007d3a12e1f095a0f9ca3451fa3cb63130cad0443018e4c0263ba not found: ID does not exist" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.626120 4626 scope.go:117] "RemoveContainer" containerID="0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.639031 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:26 crc kubenswrapper[4626]: E0223 07:02:26.643579 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457\": container with ID starting with 0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457 not found: ID does not exist" containerID="0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.643643 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457"} err="failed to get container status \"0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457\": rpc error: code = NotFound desc = could not find container \"0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457\": container with ID starting with 0903d3844c65789d712e200a194b59d02a02237adba651f7fccd18c399bb5457 not found: ID does not exist" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.693993 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98136ef3-e347-4900-84df-0c16ce4b15b7-logs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.694065 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-config-data\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.694200 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.694236 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.694280 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-public-tls-certs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.694414 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwzp\" (UniqueName: \"kubernetes.io/projected/98136ef3-e347-4900-84df-0c16ce4b15b7-kube-api-access-htwzp\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.796920 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98136ef3-e347-4900-84df-0c16ce4b15b7-logs\") pod \"nova-api-0\" (UID: 
\"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.797072 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-config-data\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.797158 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.797192 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.797244 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-public-tls-certs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.797420 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwzp\" (UniqueName: \"kubernetes.io/projected/98136ef3-e347-4900-84df-0c16ce4b15b7-kube-api-access-htwzp\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.797423 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/98136ef3-e347-4900-84df-0c16ce4b15b7-logs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.805800 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.806198 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.806740 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-public-tls-certs\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.808117 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-config-data\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.827766 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwzp\" (UniqueName: \"kubernetes.io/projected/98136ef3-e347-4900-84df-0c16ce4b15b7-kube-api-access-htwzp\") pod \"nova-api-0\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " pod="openstack/nova-api-0" Feb 23 07:02:26 crc kubenswrapper[4626]: I0223 07:02:26.989335 4626 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.019622 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.023420 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.047094 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.047135 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.111552 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrlw7\" (UniqueName: \"kubernetes.io/projected/c8ff8af3-00d8-482b-9054-fa9b2f3bc766-kube-api-access-lrlw7\") pod \"c8ff8af3-00d8-482b-9054-fa9b2f3bc766\" (UID: \"c8ff8af3-00d8-482b-9054-fa9b2f3bc766\") " Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.122593 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ff8af3-00d8-482b-9054-fa9b2f3bc766-kube-api-access-lrlw7" (OuterVolumeSpecName: "kube-api-access-lrlw7") pod "c8ff8af3-00d8-482b-9054-fa9b2f3bc766" (UID: "c8ff8af3-00d8-482b-9054-fa9b2f3bc766"). InnerVolumeSpecName "kube-api-access-lrlw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.215237 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrlw7\" (UniqueName: \"kubernetes.io/projected/c8ff8af3-00d8-482b-9054-fa9b2f3bc766-kube-api-access-lrlw7\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.359678 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.507593 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerStarted","Data":"feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c"} Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.510709 4626 generic.go:334] "Generic (PLEG): container finished" podID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" containerID="91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d" exitCode=2 Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.512011 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.519089 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8ff8af3-00d8-482b-9054-fa9b2f3bc766","Type":"ContainerDied","Data":"91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d"} Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.519117 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c8ff8af3-00d8-482b-9054-fa9b2f3bc766","Type":"ContainerDied","Data":"b4566f124eb2b1c79abc3b65f023ab3570795a37fbd2b8b29ed0e5d80542189d"} Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.519134 4626 scope.go:117] "RemoveContainer" containerID="91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.553915 4626 scope.go:117] "RemoveContainer" containerID="91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d" Feb 23 07:02:27 crc kubenswrapper[4626]: E0223 07:02:27.554279 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d\": container with ID starting with 91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d not found: ID does not exist" containerID="91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.554312 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d"} err="failed to get container status \"91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d\": rpc error: code = NotFound desc = could not find container \"91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d\": container with ID starting with 
91aaf13aae1721e25e21bc34811478e67cd7f6bc72c9bf1a5a1c12ed1ae5d25d not found: ID does not exist" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.555100 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.556215 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.592366 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.643545 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:02:27 crc kubenswrapper[4626]: E0223 07:02:27.644299 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" containerName="kube-state-metrics" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.644313 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" containerName="kube-state-metrics" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.644654 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" containerName="kube-state-metrics" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.645678 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.651705 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.652639 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.729221 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.747647 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.748031 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.748099 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgb9\" (UniqueName: \"kubernetes.io/projected/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-api-access-4zgb9\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.748184 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.764550 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.852533 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tz4lq"] Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.854243 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.858242 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.858409 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.859864 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.860182 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.860399 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgb9\" 
(UniqueName: \"kubernetes.io/projected/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-api-access-4zgb9\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.860678 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.867809 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tz4lq"] Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.879782 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.880560 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.881350 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.890168 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgb9\" (UniqueName: \"kubernetes.io/projected/6b6610d6-48cf-4f86-ac4d-603b4bb60f04-kube-api-access-4zgb9\") pod \"kube-state-metrics-0\" (UID: \"6b6610d6-48cf-4f86-ac4d-603b4bb60f04\") " pod="openstack/kube-state-metrics-0" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.962121 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-scripts\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.962185 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-config-data\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.962217 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfslt\" (UniqueName: \"kubernetes.io/projected/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-kube-api-access-dfslt\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:27 crc kubenswrapper[4626]: I0223 07:02:27.962274 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 
07:02:28.026137 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.034698 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ad1719-e334-43ac-ab3e-e36be1e23fb3" path="/var/lib/kubelet/pods/52ad1719-e334-43ac-ab3e-e36be1e23fb3/volumes" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.035353 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ff8af3-00d8-482b-9054-fa9b2f3bc766" path="/var/lib/kubelet/pods/c8ff8af3-00d8-482b-9054-fa9b2f3bc766/volumes" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.064652 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.064784 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-scripts\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.064862 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-config-data\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.064909 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfslt\" (UniqueName: 
\"kubernetes.io/projected/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-kube-api-access-dfslt\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.077214 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-config-data\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.077343 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.080895 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-scripts\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.081119 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfslt\" (UniqueName: \"kubernetes.io/projected/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-kube-api-access-dfslt\") pod \"nova-cell1-cell-mapping-tz4lq\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") " pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.095857 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-log" probeResult="failure" 
output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.096259 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.280342 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.527095 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98136ef3-e347-4900-84df-0c16ce4b15b7","Type":"ContainerStarted","Data":"e72b0bb6361d66bee95a2b6a7f814fe9dbac375543f73e1cae0d70da0d180778"} Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.527179 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98136ef3-e347-4900-84df-0c16ce4b15b7","Type":"ContainerStarted","Data":"f3b5a4f5369e106e0f532949a0068de226db839ba6a0020655b0dcd8b6c3d809"} Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.527192 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98136ef3-e347-4900-84df-0c16ce4b15b7","Type":"ContainerStarted","Data":"aedbac09d89c5103ce166cc5de9c84fc1e263e2507f390933b5f8b5bb91af85e"} Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.533470 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerStarted","Data":"9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0"} Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.546590 4626 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0" podStartSLOduration=2.546555174 podStartE2EDuration="2.546555174s" podCreationTimestamp="2026-02-23 07:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:28.543839214 +0000 UTC m=+1300.883168469" watchObservedRunningTime="2026-02-23 07:02:28.546555174 +0000 UTC m=+1300.885884441" Feb 23 07:02:28 crc kubenswrapper[4626]: I0223 07:02:28.621679 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.057481 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tz4lq"] Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.543835 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b6610d6-48cf-4f86-ac4d-603b4bb60f04","Type":"ContainerStarted","Data":"55ae8c384149740c13650a108e537b6f2a9a4ad337310399ea312e02750bfdca"} Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.544159 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b6610d6-48cf-4f86-ac4d-603b4bb60f04","Type":"ContainerStarted","Data":"ce383267991263052e82c726c921d1e0f339954158e8d73911658e20a66b3376"} Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.545660 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.548140 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tz4lq" event={"ID":"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59","Type":"ContainerStarted","Data":"3a8427c9ef8b10ef11481a165e8a99279a2aa7a0bd65951b9a942131c66f8cf3"} Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.548165 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-tz4lq" event={"ID":"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59","Type":"ContainerStarted","Data":"b578b64147da72efa718b255b98cae9d3e0bebf9185bb6dd33fcbe19efaa94c8"} Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.611246 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.297040346 podStartE2EDuration="2.611215439s" podCreationTimestamp="2026-02-23 07:02:27 +0000 UTC" firstStartedPulling="2026-02-23 07:02:28.666435962 +0000 UTC m=+1301.005765228" lastFinishedPulling="2026-02-23 07:02:28.980611054 +0000 UTC m=+1301.319940321" observedRunningTime="2026-02-23 07:02:29.563847574 +0000 UTC m=+1301.903176840" watchObservedRunningTime="2026-02-23 07:02:29.611215439 +0000 UTC m=+1301.950544705" Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.636505 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tz4lq" podStartSLOduration=2.636474331 podStartE2EDuration="2.636474331s" podCreationTimestamp="2026-02-23 07:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:29.606219951 +0000 UTC m=+1301.945549217" watchObservedRunningTime="2026-02-23 07:02:29.636474331 +0000 UTC m=+1301.975803596" Feb 23 07:02:29 crc kubenswrapper[4626]: I0223 07:02:29.892083 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.014047 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.042895 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5cc5d4d5-zchg2"] Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.043229 4626 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" containerName="dnsmasq-dns" containerID="cri-o://a51ba017194a177695e0376a2dddd9cba0707274b5d33308278f93cc1a93a616" gracePeriod=10 Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.176853 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.204:5353: connect: connection refused" Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.558556 4626 generic.go:334] "Generic (PLEG): container finished" podID="f0dbf362-e5f3-461b-8641-9e043f490538" containerID="a51ba017194a177695e0376a2dddd9cba0707274b5d33308278f93cc1a93a616" exitCode=0 Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.558614 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" event={"ID":"f0dbf362-e5f3-461b-8641-9e043f490538","Type":"ContainerDied","Data":"a51ba017194a177695e0376a2dddd9cba0707274b5d33308278f93cc1a93a616"} Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.565233 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerStarted","Data":"d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0"} Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.565686 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:02:30 crc kubenswrapper[4626]: I0223 07:02:30.584124 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.423071849 podStartE2EDuration="6.584114181s" podCreationTimestamp="2026-02-23 07:02:24 +0000 UTC" firstStartedPulling="2026-02-23 07:02:25.419048987 +0000 UTC m=+1297.758378242" lastFinishedPulling="2026-02-23 
07:02:29.580091319 +0000 UTC m=+1301.919420574" observedRunningTime="2026-02-23 07:02:30.579933969 +0000 UTC m=+1302.919263235" watchObservedRunningTime="2026-02-23 07:02:30.584114181 +0000 UTC m=+1302.923443447" Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.110695 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.256226 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-nb\") pod \"f0dbf362-e5f3-461b-8641-9e043f490538\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.256275 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-config\") pod \"f0dbf362-e5f3-461b-8641-9e043f490538\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.256514 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-swift-storage-0\") pod \"f0dbf362-e5f3-461b-8641-9e043f490538\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.256771 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-svc\") pod \"f0dbf362-e5f3-461b-8641-9e043f490538\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.256819 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-sb\") pod \"f0dbf362-e5f3-461b-8641-9e043f490538\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.256906 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqsdm\" (UniqueName: \"kubernetes.io/projected/f0dbf362-e5f3-461b-8641-9e043f490538-kube-api-access-mqsdm\") pod \"f0dbf362-e5f3-461b-8641-9e043f490538\" (UID: \"f0dbf362-e5f3-461b-8641-9e043f490538\") " Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.271187 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0dbf362-e5f3-461b-8641-9e043f490538-kube-api-access-mqsdm" (OuterVolumeSpecName: "kube-api-access-mqsdm") pod "f0dbf362-e5f3-461b-8641-9e043f490538" (UID: "f0dbf362-e5f3-461b-8641-9e043f490538"). InnerVolumeSpecName "kube-api-access-mqsdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.320050 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0dbf362-e5f3-461b-8641-9e043f490538" (UID: "f0dbf362-e5f3-461b-8641-9e043f490538"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.330510 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0dbf362-e5f3-461b-8641-9e043f490538" (UID: "f0dbf362-e5f3-461b-8641-9e043f490538"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.338510 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-config" (OuterVolumeSpecName: "config") pod "f0dbf362-e5f3-461b-8641-9e043f490538" (UID: "f0dbf362-e5f3-461b-8641-9e043f490538"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.343545 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0dbf362-e5f3-461b-8641-9e043f490538" (UID: "f0dbf362-e5f3-461b-8641-9e043f490538"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.356936 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0dbf362-e5f3-461b-8641-9e043f490538" (UID: "f0dbf362-e5f3-461b-8641-9e043f490538"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.372430 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.372462 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.372480 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqsdm\" (UniqueName: \"kubernetes.io/projected/f0dbf362-e5f3-461b-8641-9e043f490538-kube-api-access-mqsdm\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.372490 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.372514 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.372527 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0dbf362-e5f3-461b-8641-9e043f490538-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.578421 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2" event={"ID":"f0dbf362-e5f3-461b-8641-9e043f490538","Type":"ContainerDied","Data":"b7d30e5f5e9a803558680d905021c2279ba60dd389e7e37b538b5520c31cad2d"}
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.578867 4626 scope.go:117] "RemoveContainer" containerID="a51ba017194a177695e0376a2dddd9cba0707274b5d33308278f93cc1a93a616"
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.578928 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-central-agent" containerID="cri-o://c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac" gracePeriod=30
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.578466 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5cc5d4d5-zchg2"
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.579227 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="proxy-httpd" containerID="cri-o://d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0" gracePeriod=30
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.579283 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="sg-core" containerID="cri-o://9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0" gracePeriod=30
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.579332 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-notification-agent" containerID="cri-o://feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c" gracePeriod=30
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.630864 4626 scope.go:117] "RemoveContainer" containerID="705e17a56d88e62f86c0ed4ace4de233cf49be755d1689438abfd4da428c642c"
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.637244 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5cc5d4d5-zchg2"]
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.652871 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5cc5d4d5-zchg2"]
Feb 23 07:02:31 crc kubenswrapper[4626]: I0223 07:02:31.996727 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" path="/var/lib/kubelet/pods/f0dbf362-e5f3-461b-8641-9e043f490538/volumes"
Feb 23 07:02:32 crc kubenswrapper[4626]: I0223 07:02:32.590938 4626 generic.go:334] "Generic (PLEG): container finished" podID="0ec5912f-17d6-485b-acf4-d5017b785561" containerID="d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0" exitCode=0
Feb 23 07:02:32 crc kubenswrapper[4626]: I0223 07:02:32.590984 4626 generic.go:334] "Generic (PLEG): container finished" podID="0ec5912f-17d6-485b-acf4-d5017b785561" containerID="9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0" exitCode=2
Feb 23 07:02:32 crc kubenswrapper[4626]: I0223 07:02:32.590994 4626 generic.go:334] "Generic (PLEG): container finished" podID="0ec5912f-17d6-485b-acf4-d5017b785561" containerID="feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c" exitCode=0
Feb 23 07:02:32 crc kubenswrapper[4626]: I0223 07:02:32.591022 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerDied","Data":"d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0"}
Feb 23 07:02:32 crc kubenswrapper[4626]: I0223 07:02:32.591090 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerDied","Data":"9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0"}
Feb 23 07:02:32 crc kubenswrapper[4626]: I0223 07:02:32.591102 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerDied","Data":"feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c"}
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.282220 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.470102 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q9jk\" (UniqueName: \"kubernetes.io/projected/0ec5912f-17d6-485b-acf4-d5017b785561-kube-api-access-8q9jk\") pod \"0ec5912f-17d6-485b-acf4-d5017b785561\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") "
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.470574 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-sg-core-conf-yaml\") pod \"0ec5912f-17d6-485b-acf4-d5017b785561\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") "
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.470788 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-run-httpd\") pod \"0ec5912f-17d6-485b-acf4-d5017b785561\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") "
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.470876 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-scripts\") pod \"0ec5912f-17d6-485b-acf4-d5017b785561\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") "
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.471031 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-log-httpd\") pod \"0ec5912f-17d6-485b-acf4-d5017b785561\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") "
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.471109 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-config-data\") pod \"0ec5912f-17d6-485b-acf4-d5017b785561\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") "
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.471172 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ec5912f-17d6-485b-acf4-d5017b785561" (UID: "0ec5912f-17d6-485b-acf4-d5017b785561"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.471195 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-combined-ca-bundle\") pod \"0ec5912f-17d6-485b-acf4-d5017b785561\" (UID: \"0ec5912f-17d6-485b-acf4-d5017b785561\") "
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.471475 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ec5912f-17d6-485b-acf4-d5017b785561" (UID: "0ec5912f-17d6-485b-acf4-d5017b785561"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.473140 4626 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.473168 4626 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ec5912f-17d6-485b-acf4-d5017b785561-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.477528 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-scripts" (OuterVolumeSpecName: "scripts") pod "0ec5912f-17d6-485b-acf4-d5017b785561" (UID: "0ec5912f-17d6-485b-acf4-d5017b785561"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.477595 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec5912f-17d6-485b-acf4-d5017b785561-kube-api-access-8q9jk" (OuterVolumeSpecName: "kube-api-access-8q9jk") pod "0ec5912f-17d6-485b-acf4-d5017b785561" (UID: "0ec5912f-17d6-485b-acf4-d5017b785561"). InnerVolumeSpecName "kube-api-access-8q9jk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.501134 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ec5912f-17d6-485b-acf4-d5017b785561" (UID: "0ec5912f-17d6-485b-acf4-d5017b785561"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.539829 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ec5912f-17d6-485b-acf4-d5017b785561" (UID: "0ec5912f-17d6-485b-acf4-d5017b785561"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.561117 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-config-data" (OuterVolumeSpecName: "config-data") pod "0ec5912f-17d6-485b-acf4-d5017b785561" (UID: "0ec5912f-17d6-485b-acf4-d5017b785561"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.576729 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.576764 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.576776 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.576788 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q9jk\" (UniqueName: \"kubernetes.io/projected/0ec5912f-17d6-485b-acf4-d5017b785561-kube-api-access-8q9jk\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.576800 4626 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ec5912f-17d6-485b-acf4-d5017b785561-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.623050 4626 generic.go:334] "Generic (PLEG): container finished" podID="da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" containerID="3a8427c9ef8b10ef11481a165e8a99279a2aa7a0bd65951b9a942131c66f8cf3" exitCode=0
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.623158 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tz4lq" event={"ID":"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59","Type":"ContainerDied","Data":"3a8427c9ef8b10ef11481a165e8a99279a2aa7a0bd65951b9a942131c66f8cf3"}
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.625841 4626 generic.go:334] "Generic (PLEG): container finished" podID="0ec5912f-17d6-485b-acf4-d5017b785561" containerID="c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac" exitCode=0
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.625928 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.625957 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerDied","Data":"c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac"}
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.626177 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ec5912f-17d6-485b-acf4-d5017b785561","Type":"ContainerDied","Data":"f1d18c1f8e2e9650cc3ce3da275856550850f07df89f1fda159381b1db61346b"}
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.626207 4626 scope.go:117] "RemoveContainer" containerID="d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.659982 4626 scope.go:117] "RemoveContainer" containerID="9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.672912 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.677930 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.679896 4626 scope.go:117] "RemoveContainer" containerID="feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.694900 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.695343 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="proxy-httpd"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695355 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="proxy-httpd"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.695373 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" containerName="dnsmasq-dns"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695378 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" containerName="dnsmasq-dns"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.695398 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="sg-core"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695403 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="sg-core"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.695430 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" containerName="init"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695435 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" containerName="init"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.695443 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-central-agent"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695448 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-central-agent"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.695460 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-notification-agent"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695465 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-notification-agent"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695642 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-notification-agent"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695656 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0dbf362-e5f3-461b-8641-9e043f490538" containerName="dnsmasq-dns"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695665 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="proxy-httpd"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695676 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="sg-core"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.695686 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" containerName="ceilometer-central-agent"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.697240 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.699682 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.701162 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.701365 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.716685 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.732060 4626 scope.go:117] "RemoveContainer" containerID="c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.751045 4626 scope.go:117] "RemoveContainer" containerID="d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.751422 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0\": container with ID starting with d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0 not found: ID does not exist" containerID="d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.751510 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0"} err="failed to get container status \"d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0\": rpc error: code = NotFound desc = could not find container \"d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0\": container with ID starting with d250b9002794d8ff43dd5de3d19bc23dcc80916773ef60d19ef163839ff36ba0 not found: ID does not exist"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.751551 4626 scope.go:117] "RemoveContainer" containerID="9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.752010 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0\": container with ID starting with 9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0 not found: ID does not exist" containerID="9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.752043 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0"} err="failed to get container status \"9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0\": rpc error: code = NotFound desc = could not find container \"9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0\": container with ID starting with 9135535e5d4242c9cfa2a794ff2b24a4accab0d5344adcf91bc1066303c6a2c0 not found: ID does not exist"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.752072 4626 scope.go:117] "RemoveContainer" containerID="feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.752429 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c\": container with ID starting with feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c not found: ID does not exist" containerID="feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.752460 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c"} err="failed to get container status \"feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c\": rpc error: code = NotFound desc = could not find container \"feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c\": container with ID starting with feeb57f07b12078cc77730b1b0faf2488215a6b217f56401f54f72edb3189e0c not found: ID does not exist"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.752477 4626 scope.go:117] "RemoveContainer" containerID="c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac"
Feb 23 07:02:35 crc kubenswrapper[4626]: E0223 07:02:35.752840 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac\": container with ID starting with c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac not found: ID does not exist" containerID="c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.752875 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac"} err="failed to get container status \"c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac\": rpc error: code = NotFound desc = could not find container \"c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac\": container with ID starting with c725faa4a230db1ea077d84c34035f88f6985c2d0fe40d8e6e6f183e1b3167ac not found: ID does not exist"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.884683 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9b9cc07-9e39-487b-85af-eaeaae575087-run-httpd\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.885652 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-config-data\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.885824 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9c96\" (UniqueName: \"kubernetes.io/projected/d9b9cc07-9e39-487b-85af-eaeaae575087-kube-api-access-r9c96\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.885922 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.885965 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-scripts\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.886037 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.886087 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9b9cc07-9e39-487b-85af-eaeaae575087-log-httpd\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.886193 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989126 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-config-data\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989260 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9c96\" (UniqueName: \"kubernetes.io/projected/d9b9cc07-9e39-487b-85af-eaeaae575087-kube-api-access-r9c96\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989320 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-scripts\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989341 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989422 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989473 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9b9cc07-9e39-487b-85af-eaeaae575087-log-httpd\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989664 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.989706 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9b9cc07-9e39-487b-85af-eaeaae575087-run-httpd\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.990349 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9b9cc07-9e39-487b-85af-eaeaae575087-run-httpd\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.990475 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9b9cc07-9e39-487b-85af-eaeaae575087-log-httpd\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.995365 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-config-data\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.995977 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.996158 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-scripts\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.996758 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:35 crc kubenswrapper[4626]: I0223 07:02:35.997738 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec5912f-17d6-485b-acf4-d5017b785561" path="/var/lib/kubelet/pods/0ec5912f-17d6-485b-acf4-d5017b785561/volumes"
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.002155 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b9cc07-9e39-487b-85af-eaeaae575087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.011558 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9c96\" (UniqueName: \"kubernetes.io/projected/d9b9cc07-9e39-487b-85af-eaeaae575087-kube-api-access-r9c96\") pod \"ceilometer-0\" (UID: \"d9b9cc07-9e39-487b-85af-eaeaae575087\") " pod="openstack/ceilometer-0"
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.014740 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.489800 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.637477 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9b9cc07-9e39-487b-85af-eaeaae575087","Type":"ContainerStarted","Data":"e9e1622bc7ad20fd8063273a46b2e7821db0c9934fc03dd155cb9d05c9ef57a7"}
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.939441 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tz4lq"
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.991533 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 07:02:36 crc kubenswrapper[4626]: I0223 07:02:36.991569 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.026269 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfslt\" (UniqueName: \"kubernetes.io/projected/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-kube-api-access-dfslt\") pod \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") "
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.026528 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-config-data\") pod \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") "
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.026827 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-scripts\") pod \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") "
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.026897 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-combined-ca-bundle\") pod \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\" (UID: \"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59\") "
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.031689 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-kube-api-access-dfslt" (OuterVolumeSpecName: "kube-api-access-dfslt") pod "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" (UID: "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59"). InnerVolumeSpecName "kube-api-access-dfslt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.037602 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-scripts" (OuterVolumeSpecName: "scripts") pod "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" (UID: "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.052526 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-config-data" (OuterVolumeSpecName: "config-data") pod "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" (UID: "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.058846 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.059011 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.065359 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" (UID: "da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.067702 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.070043 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.130056 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfslt\" (UniqueName: \"kubernetes.io/projected/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-kube-api-access-dfslt\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.130309 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.130317 4626 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.130325 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.648287 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9b9cc07-9e39-487b-85af-eaeaae575087","Type":"ContainerStarted","Data":"7ce12ac5b8122f94eec55f6a2306539c189f4b1e11e04a75caed6b5400ebc9c5"} Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.650559 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tz4lq" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.653677 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tz4lq" event={"ID":"da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59","Type":"ContainerDied","Data":"b578b64147da72efa718b255b98cae9d3e0bebf9185bb6dd33fcbe19efaa94c8"} Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.653728 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b578b64147da72efa718b255b98cae9d3e0bebf9185bb6dd33fcbe19efaa94c8" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.854648 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.855149 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-log" containerID="cri-o://f3b5a4f5369e106e0f532949a0068de226db839ba6a0020655b0dcd8b6c3d809" gracePeriod=30 Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.855302 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-api" containerID="cri-o://e72b0bb6361d66bee95a2b6a7f814fe9dbac375543f73e1cae0d70da0d180778" gracePeriod=30 Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.886720 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": EOF" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.886903 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.214:8774/\": EOF" Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.887690 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.887888 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f460fe0c-1d4d-4457-8800-6cc7d9a12279" containerName="nova-scheduler-scheduler" containerID="cri-o://dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" gracePeriod=30 Feb 23 07:02:37 crc kubenswrapper[4626]: I0223 07:02:37.954977 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:38 crc kubenswrapper[4626]: I0223 07:02:38.054614 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 23 07:02:38 crc kubenswrapper[4626]: E0223 07:02:38.581711 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:02:38 crc kubenswrapper[4626]: E0223 07:02:38.583685 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:02:38 crc kubenswrapper[4626]: E0223 07:02:38.584855 4626 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:02:38 crc kubenswrapper[4626]: E0223 07:02:38.584902 4626 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f460fe0c-1d4d-4457-8800-6cc7d9a12279" containerName="nova-scheduler-scheduler" Feb 23 07:02:38 crc kubenswrapper[4626]: I0223 07:02:38.707904 4626 generic.go:334] "Generic (PLEG): container finished" podID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerID="f3b5a4f5369e106e0f532949a0068de226db839ba6a0020655b0dcd8b6c3d809" exitCode=143 Feb 23 07:02:38 crc kubenswrapper[4626]: I0223 07:02:38.708016 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98136ef3-e347-4900-84df-0c16ce4b15b7","Type":"ContainerDied","Data":"f3b5a4f5369e106e0f532949a0068de226db839ba6a0020655b0dcd8b6c3d809"} Feb 23 07:02:38 crc kubenswrapper[4626]: I0223 07:02:38.724261 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9b9cc07-9e39-487b-85af-eaeaae575087","Type":"ContainerStarted","Data":"3e8be67a39c5c24762736c93caa184cf67302b72cea90f4e95db5681ba509eba"} Feb 23 07:02:39 crc kubenswrapper[4626]: I0223 07:02:39.736536 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-log" containerID="cri-o://a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204" gracePeriod=30 Feb 23 07:02:39 crc kubenswrapper[4626]: I0223 07:02:39.736669 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9b9cc07-9e39-487b-85af-eaeaae575087","Type":"ContainerStarted","Data":"2ac4ca864c34ced11d167c4d7a21f0174a359870b2621bd05e940c6b0dc4191f"} Feb 23 07:02:39 crc kubenswrapper[4626]: I0223 07:02:39.737348 
4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-metadata" containerID="cri-o://c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361" gracePeriod=30 Feb 23 07:02:40 crc kubenswrapper[4626]: I0223 07:02:40.750550 4626 generic.go:334] "Generic (PLEG): container finished" podID="85e80536-e515-42fe-946e-efccabf4a93e" containerID="a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204" exitCode=143 Feb 23 07:02:40 crc kubenswrapper[4626]: I0223 07:02:40.750637 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"85e80536-e515-42fe-946e-efccabf4a93e","Type":"ContainerDied","Data":"a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204"} Feb 23 07:02:40 crc kubenswrapper[4626]: I0223 07:02:40.754965 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9b9cc07-9e39-487b-85af-eaeaae575087","Type":"ContainerStarted","Data":"8888c3ad3339bf934a6f611b838547f0084680e9fa9180d33db2ea9c0b495eb3"} Feb 23 07:02:40 crc kubenswrapper[4626]: I0223 07:02:40.755181 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:02:40 crc kubenswrapper[4626]: I0223 07:02:40.794184 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.009226594 podStartE2EDuration="5.794163032s" podCreationTimestamp="2026-02-23 07:02:35 +0000 UTC" firstStartedPulling="2026-02-23 07:02:36.492156713 +0000 UTC m=+1308.831485968" lastFinishedPulling="2026-02-23 07:02:40.27709314 +0000 UTC m=+1312.616422406" observedRunningTime="2026-02-23 07:02:40.774946853 +0000 UTC m=+1313.114276118" watchObservedRunningTime="2026-02-23 07:02:40.794163032 +0000 UTC m=+1313.133492298" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.595308 4626 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.723690 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-config-data\") pod \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.723903 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cxf7\" (UniqueName: \"kubernetes.io/projected/f460fe0c-1d4d-4457-8800-6cc7d9a12279-kube-api-access-9cxf7\") pod \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.724287 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-combined-ca-bundle\") pod \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\" (UID: \"f460fe0c-1d4d-4457-8800-6cc7d9a12279\") " Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.731851 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f460fe0c-1d4d-4457-8800-6cc7d9a12279-kube-api-access-9cxf7" (OuterVolumeSpecName: "kube-api-access-9cxf7") pod "f460fe0c-1d4d-4457-8800-6cc7d9a12279" (UID: "f460fe0c-1d4d-4457-8800-6cc7d9a12279"). InnerVolumeSpecName "kube-api-access-9cxf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.752805 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-config-data" (OuterVolumeSpecName: "config-data") pod "f460fe0c-1d4d-4457-8800-6cc7d9a12279" (UID: "f460fe0c-1d4d-4457-8800-6cc7d9a12279"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.760837 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f460fe0c-1d4d-4457-8800-6cc7d9a12279" (UID: "f460fe0c-1d4d-4457-8800-6cc7d9a12279"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.778353 4626 generic.go:334] "Generic (PLEG): container finished" podID="f460fe0c-1d4d-4457-8800-6cc7d9a12279" containerID="dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" exitCode=0 Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.778433 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f460fe0c-1d4d-4457-8800-6cc7d9a12279","Type":"ContainerDied","Data":"dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62"} Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.778488 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f460fe0c-1d4d-4457-8800-6cc7d9a12279","Type":"ContainerDied","Data":"69fc982d0be94e0f8cf023bc5fda2ebcb54ddb93cac56f86a3de01ac3c1d4bca"} Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.778553 4626 scope.go:117] "RemoveContainer" containerID="dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.778814 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.816041 4626 scope.go:117] "RemoveContainer" containerID="dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" Feb 23 07:02:42 crc kubenswrapper[4626]: E0223 07:02:42.816691 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62\": container with ID starting with dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62 not found: ID does not exist" containerID="dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.816724 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62"} err="failed to get container status \"dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62\": rpc error: code = NotFound desc = could not find container \"dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62\": container with ID starting with dc0971e4c3932ed57b3a1655991d12d2f7457e47643f369cfdbb9d5d7f96fe62 not found: ID does not exist" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.827741 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.827769 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f460fe0c-1d4d-4457-8800-6cc7d9a12279-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.827779 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cxf7\" (UniqueName: 
\"kubernetes.io/projected/f460fe0c-1d4d-4457-8800-6cc7d9a12279-kube-api-access-9cxf7\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.834531 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.841425 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.858113 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:02:42 crc kubenswrapper[4626]: E0223 07:02:42.858464 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" containerName="nova-manage" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.858481 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" containerName="nova-manage" Feb 23 07:02:42 crc kubenswrapper[4626]: E0223 07:02:42.858552 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f460fe0c-1d4d-4457-8800-6cc7d9a12279" containerName="nova-scheduler-scheduler" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.858559 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f460fe0c-1d4d-4457-8800-6cc7d9a12279" containerName="nova-scheduler-scheduler" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.858720 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" containerName="nova-manage" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.858736 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f460fe0c-1d4d-4457-8800-6cc7d9a12279" containerName="nova-scheduler-scheduler" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.859338 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.862069 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.875459 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.895761 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:49098->10.217.0.211:8775: read: connection reset by peer" Feb 23 07:02:42 crc kubenswrapper[4626]: I0223 07:02:42.896262 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:49084->10.217.0.211:8775: read: connection reset by peer" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.032380 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcf4a83-5fb6-41df-bf75-af6298f822e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.032538 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr7p8\" (UniqueName: \"kubernetes.io/projected/8dcf4a83-5fb6-41df-bf75-af6298f822e1-kube-api-access-jr7p8\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.032626 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dcf4a83-5fb6-41df-bf75-af6298f822e1-config-data\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.136461 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcf4a83-5fb6-41df-bf75-af6298f822e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.136688 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr7p8\" (UniqueName: \"kubernetes.io/projected/8dcf4a83-5fb6-41df-bf75-af6298f822e1-kube-api-access-jr7p8\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.136783 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dcf4a83-5fb6-41df-bf75-af6298f822e1-config-data\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.142522 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcf4a83-5fb6-41df-bf75-af6298f822e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.145855 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8dcf4a83-5fb6-41df-bf75-af6298f822e1-config-data\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.158244 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr7p8\" (UniqueName: \"kubernetes.io/projected/8dcf4a83-5fb6-41df-bf75-af6298f822e1-kube-api-access-jr7p8\") pod \"nova-scheduler-0\" (UID: \"8dcf4a83-5fb6-41df-bf75-af6298f822e1\") " pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.178280 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.464513 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.567708 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lx2l\" (UniqueName: \"kubernetes.io/projected/85e80536-e515-42fe-946e-efccabf4a93e-kube-api-access-9lx2l\") pod \"85e80536-e515-42fe-946e-efccabf4a93e\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.567808 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-nova-metadata-tls-certs\") pod \"85e80536-e515-42fe-946e-efccabf4a93e\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.567870 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-combined-ca-bundle\") pod \"85e80536-e515-42fe-946e-efccabf4a93e\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " 
Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.567899 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e80536-e515-42fe-946e-efccabf4a93e-logs\") pod \"85e80536-e515-42fe-946e-efccabf4a93e\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.567937 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-config-data\") pod \"85e80536-e515-42fe-946e-efccabf4a93e\" (UID: \"85e80536-e515-42fe-946e-efccabf4a93e\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.574655 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e80536-e515-42fe-946e-efccabf4a93e-logs" (OuterVolumeSpecName: "logs") pod "85e80536-e515-42fe-946e-efccabf4a93e" (UID: "85e80536-e515-42fe-946e-efccabf4a93e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.601914 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e80536-e515-42fe-946e-efccabf4a93e-kube-api-access-9lx2l" (OuterVolumeSpecName: "kube-api-access-9lx2l") pod "85e80536-e515-42fe-946e-efccabf4a93e" (UID: "85e80536-e515-42fe-946e-efccabf4a93e"). InnerVolumeSpecName "kube-api-access-9lx2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.639580 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85e80536-e515-42fe-946e-efccabf4a93e" (UID: "85e80536-e515-42fe-946e-efccabf4a93e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.643600 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-config-data" (OuterVolumeSpecName: "config-data") pod "85e80536-e515-42fe-946e-efccabf4a93e" (UID: "85e80536-e515-42fe-946e-efccabf4a93e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.662567 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "85e80536-e515-42fe-946e-efccabf4a93e" (UID: "85e80536-e515-42fe-946e-efccabf4a93e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.662991 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.669992 4626 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.670028 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.670039 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e80536-e515-42fe-946e-efccabf4a93e-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.670047 4626 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e80536-e515-42fe-946e-efccabf4a93e-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.670057 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lx2l\" (UniqueName: \"kubernetes.io/projected/85e80536-e515-42fe-946e-efccabf4a93e-kube-api-access-9lx2l\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.839040 4626 generic.go:334] "Generic (PLEG): container finished" podID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerID="e72b0bb6361d66bee95a2b6a7f814fe9dbac375543f73e1cae0d70da0d180778" exitCode=0 Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.839149 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98136ef3-e347-4900-84df-0c16ce4b15b7","Type":"ContainerDied","Data":"e72b0bb6361d66bee95a2b6a7f814fe9dbac375543f73e1cae0d70da0d180778"} Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.847959 4626 generic.go:334] "Generic (PLEG): container finished" podID="85e80536-e515-42fe-946e-efccabf4a93e" containerID="c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361" exitCode=0 Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.848092 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"85e80536-e515-42fe-946e-efccabf4a93e","Type":"ContainerDied","Data":"c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361"} Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.848147 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"85e80536-e515-42fe-946e-efccabf4a93e","Type":"ContainerDied","Data":"02da5d487f8f6feb587954c8171db90618d0108f49a2588197a95027af16ab70"} Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.848184 4626 scope.go:117] "RemoveContainer" 
containerID="c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.848459 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.860065 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8dcf4a83-5fb6-41df-bf75-af6298f822e1","Type":"ContainerStarted","Data":"8f41280e8d9ab935e30a835b3c33acb1bd80788d8ff363cf3cfd3bc4a8f53daa"} Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.897274 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.914947 4626 scope.go:117] "RemoveContainer" containerID="a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.922905 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.936580 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:43 crc kubenswrapper[4626]: E0223 07:02:43.937199 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-metadata" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.937216 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-metadata" Feb 23 07:02:43 crc kubenswrapper[4626]: E0223 07:02:43.937249 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-log" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.937256 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-log" 
Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.937476 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-log" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.937506 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e80536-e515-42fe-946e-efccabf4a93e" containerName="nova-metadata-metadata" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.938684 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.942055 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.942326 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.943714 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.955895 4626 scope.go:117] "RemoveContainer" containerID="c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361" Feb 23 07:02:43 crc kubenswrapper[4626]: E0223 07:02:43.957960 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361\": container with ID starting with c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361 not found: ID does not exist" containerID="c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.958007 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361"} err="failed to get container status \"c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361\": rpc error: code = NotFound desc = could not find container \"c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361\": container with ID starting with c3f38d838b141b4b62e8c983c7a7e08876fb857bbeccb3673fd6ec56ad6d1361 not found: ID does not exist" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.958030 4626 scope.go:117] "RemoveContainer" containerID="a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204" Feb 23 07:02:43 crc kubenswrapper[4626]: E0223 07:02:43.963688 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204\": container with ID starting with a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204 not found: ID does not exist" containerID="a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.963718 
4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204"} err="failed to get container status \"a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204\": rpc error: code = NotFound desc = could not find container \"a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204\": container with ID starting with a1f9e73d18dd57a24dac8ed129f65d58496b846761ab07f35533791bacea9204 not found: ID does not exist" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.964359 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.990620 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-public-tls-certs\") pod \"98136ef3-e347-4900-84df-0c16ce4b15b7\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.990726 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htwzp\" (UniqueName: \"kubernetes.io/projected/98136ef3-e347-4900-84df-0c16ce4b15b7-kube-api-access-htwzp\") pod \"98136ef3-e347-4900-84df-0c16ce4b15b7\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.990789 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98136ef3-e347-4900-84df-0c16ce4b15b7-logs\") pod \"98136ef3-e347-4900-84df-0c16ce4b15b7\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.990861 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-config-data\") pod 
\"98136ef3-e347-4900-84df-0c16ce4b15b7\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.990986 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-internal-tls-certs\") pod \"98136ef3-e347-4900-84df-0c16ce4b15b7\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.991071 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-combined-ca-bundle\") pod \"98136ef3-e347-4900-84df-0c16ce4b15b7\" (UID: \"98136ef3-e347-4900-84df-0c16ce4b15b7\") " Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.991435 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-config-data\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.991516 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.991560 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc 
kubenswrapper[4626]: I0223 07:02:43.991601 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzhf\" (UniqueName: \"kubernetes.io/projected/f00ecb25-d721-43a6-810e-976c03a0572d-kube-api-access-mlzhf\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.997641 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f00ecb25-d721-43a6-810e-976c03a0572d-logs\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:43 crc kubenswrapper[4626]: I0223 07:02:43.998157 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e80536-e515-42fe-946e-efccabf4a93e" path="/var/lib/kubelet/pods/85e80536-e515-42fe-946e-efccabf4a93e/volumes" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:43.998908 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f460fe0c-1d4d-4457-8800-6cc7d9a12279" path="/var/lib/kubelet/pods/f460fe0c-1d4d-4457-8800-6cc7d9a12279/volumes" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.005308 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98136ef3-e347-4900-84df-0c16ce4b15b7-logs" (OuterVolumeSpecName: "logs") pod "98136ef3-e347-4900-84df-0c16ce4b15b7" (UID: "98136ef3-e347-4900-84df-0c16ce4b15b7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.013891 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98136ef3-e347-4900-84df-0c16ce4b15b7-kube-api-access-htwzp" (OuterVolumeSpecName: "kube-api-access-htwzp") pod "98136ef3-e347-4900-84df-0c16ce4b15b7" (UID: "98136ef3-e347-4900-84df-0c16ce4b15b7"). InnerVolumeSpecName "kube-api-access-htwzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.057644 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-config-data" (OuterVolumeSpecName: "config-data") pod "98136ef3-e347-4900-84df-0c16ce4b15b7" (UID: "98136ef3-e347-4900-84df-0c16ce4b15b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.064693 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "98136ef3-e347-4900-84df-0c16ce4b15b7" (UID: "98136ef3-e347-4900-84df-0c16ce4b15b7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.068725 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "98136ef3-e347-4900-84df-0c16ce4b15b7" (UID: "98136ef3-e347-4900-84df-0c16ce4b15b7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.069124 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98136ef3-e347-4900-84df-0c16ce4b15b7" (UID: "98136ef3-e347-4900-84df-0c16ce4b15b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.099929 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100046 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzhf\" (UniqueName: \"kubernetes.io/projected/f00ecb25-d721-43a6-810e-976c03a0572d-kube-api-access-mlzhf\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100184 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f00ecb25-d721-43a6-810e-976c03a0572d-logs\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100340 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-config-data\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 
07:02:44.100470 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100583 4626 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100601 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100612 4626 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100624 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htwzp\" (UniqueName: \"kubernetes.io/projected/98136ef3-e347-4900-84df-0c16ce4b15b7-kube-api-access-htwzp\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100635 4626 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98136ef3-e347-4900-84df-0c16ce4b15b7-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.100677 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98136ef3-e347-4900-84df-0c16ce4b15b7-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.101301 
4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f00ecb25-d721-43a6-810e-976c03a0572d-logs\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.105042 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.105410 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.106597 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00ecb25-d721-43a6-810e-976c03a0572d-config-data\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.114815 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzhf\" (UniqueName: \"kubernetes.io/projected/f00ecb25-d721-43a6-810e-976c03a0572d-kube-api-access-mlzhf\") pod \"nova-metadata-0\" (UID: \"f00ecb25-d721-43a6-810e-976c03a0572d\") " pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.260754 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.877610 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8dcf4a83-5fb6-41df-bf75-af6298f822e1","Type":"ContainerStarted","Data":"075506b67de31646e95502c08dc010b6980820240bdc802230005882a5d44aaa"} Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.880960 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98136ef3-e347-4900-84df-0c16ce4b15b7","Type":"ContainerDied","Data":"aedbac09d89c5103ce166cc5de9c84fc1e263e2507f390933b5f8b5bb91af85e"} Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.881125 4626 scope.go:117] "RemoveContainer" containerID="e72b0bb6361d66bee95a2b6a7f814fe9dbac375543f73e1cae0d70da0d180778" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.881477 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.910838 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.910813579 podStartE2EDuration="2.910813579s" podCreationTimestamp="2026-02-23 07:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:44.896781203 +0000 UTC m=+1317.236110470" watchObservedRunningTime="2026-02-23 07:02:44.910813579 +0000 UTC m=+1317.250142845" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.916831 4626 scope.go:117] "RemoveContainer" containerID="f3b5a4f5369e106e0f532949a0068de226db839ba6a0020655b0dcd8b6c3d809" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.936087 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.944987 4626 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.989049 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:44 crc kubenswrapper[4626]: E0223 07:02:44.990842 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-api" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.990949 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-api" Feb 23 07:02:44 crc kubenswrapper[4626]: E0223 07:02:44.990989 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-log" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.991031 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-log" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.991548 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-log" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.991577 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" containerName="nova-api-api" Feb 23 07:02:44 crc kubenswrapper[4626]: I0223 07:02:44.994600 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.020665 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.021016 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.021285 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.037981 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-public-tls-certs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.038091 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.038156 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkn5z\" (UniqueName: \"kubernetes.io/projected/e3af6bff-9179-4a10-aa63-4227c8933818-kube-api-access-gkn5z\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.038219 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.038656 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3af6bff-9179-4a10-aa63-4227c8933818-logs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.038815 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-config-data\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.072591 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.101566 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.145695 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-public-tls-certs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.146033 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.146242 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkn5z\" (UniqueName: 
\"kubernetes.io/projected/e3af6bff-9179-4a10-aa63-4227c8933818-kube-api-access-gkn5z\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.146465 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.146864 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3af6bff-9179-4a10-aa63-4227c8933818-logs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.147179 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-config-data\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.147468 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3af6bff-9179-4a10-aa63-4227c8933818-logs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.153294 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-config-data\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.153928 4626 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-public-tls-certs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.155595 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.161143 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3af6bff-9179-4a10-aa63-4227c8933818-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.163430 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkn5z\" (UniqueName: \"kubernetes.io/projected/e3af6bff-9179-4a10-aa63-4227c8933818-kube-api-access-gkn5z\") pod \"nova-api-0\" (UID: \"e3af6bff-9179-4a10-aa63-4227c8933818\") " pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.343940 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.781990 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:02:45 crc kubenswrapper[4626]: W0223 07:02:45.783337 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3af6bff_9179_4a10_aa63_4227c8933818.slice/crio-c0a401ab7c0ffcdab9ad1061edd16c14480959f22522a76cf7603012f7c98486 WatchSource:0}: Error finding container c0a401ab7c0ffcdab9ad1061edd16c14480959f22522a76cf7603012f7c98486: Status 404 returned error can't find the container with id c0a401ab7c0ffcdab9ad1061edd16c14480959f22522a76cf7603012f7c98486 Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.895880 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f00ecb25-d721-43a6-810e-976c03a0572d","Type":"ContainerStarted","Data":"0463efe1979266fee6847bb7e8f6df8c05078e47f9183f9b8932b6000037a735"} Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.895947 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f00ecb25-d721-43a6-810e-976c03a0572d","Type":"ContainerStarted","Data":"33b1f532a5413a84ccc43a217bccfa1d675129619907ce706a8beeadd4993ef7"} Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.895961 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f00ecb25-d721-43a6-810e-976c03a0572d","Type":"ContainerStarted","Data":"002bb825666deb7b5ab048a754bb1f1883129c6917a15d2cd2b0e6cb0ec70182"} Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.901188 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3af6bff-9179-4a10-aa63-4227c8933818","Type":"ContainerStarted","Data":"c0a401ab7c0ffcdab9ad1061edd16c14480959f22522a76cf7603012f7c98486"} Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 
07:02:45.921331 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.921305278 podStartE2EDuration="2.921305278s" podCreationTimestamp="2026-02-23 07:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:45.91715894 +0000 UTC m=+1318.256488197" watchObservedRunningTime="2026-02-23 07:02:45.921305278 +0000 UTC m=+1318.260634544" Feb 23 07:02:45 crc kubenswrapper[4626]: I0223 07:02:45.997421 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98136ef3-e347-4900-84df-0c16ce4b15b7" path="/var/lib/kubelet/pods/98136ef3-e347-4900-84df-0c16ce4b15b7/volumes" Feb 23 07:02:46 crc kubenswrapper[4626]: I0223 07:02:46.919966 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3af6bff-9179-4a10-aa63-4227c8933818","Type":"ContainerStarted","Data":"f8b6aaec584347751e1a125411d917956cdf7e7c9c9cf29df139c044a794d681"} Feb 23 07:02:46 crc kubenswrapper[4626]: I0223 07:02:46.920287 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3af6bff-9179-4a10-aa63-4227c8933818","Type":"ContainerStarted","Data":"5e4e7aeb47a78690b1c879197a605e6c39352cd44662748f27302f8b0c632fa2"} Feb 23 07:02:46 crc kubenswrapper[4626]: I0223 07:02:46.945323 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.945305894 podStartE2EDuration="2.945305894s" podCreationTimestamp="2026-02-23 07:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:02:46.936639015 +0000 UTC m=+1319.275968281" watchObservedRunningTime="2026-02-23 07:02:46.945305894 +0000 UTC m=+1319.284635160" Feb 23 07:02:48 crc kubenswrapper[4626]: I0223 07:02:48.179678 4626 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 07:02:49 crc kubenswrapper[4626]: I0223 07:02:49.261459 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:02:49 crc kubenswrapper[4626]: I0223 07:02:49.261828 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:02:53 crc kubenswrapper[4626]: I0223 07:02:53.179357 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 07:02:53 crc kubenswrapper[4626]: I0223 07:02:53.205249 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:02:54 crc kubenswrapper[4626]: I0223 07:02:54.024187 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 07:02:54 crc kubenswrapper[4626]: I0223 07:02:54.261341 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:02:54 crc kubenswrapper[4626]: I0223 07:02:54.261411 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:02:55 crc kubenswrapper[4626]: I0223 07:02:55.277635 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f00ecb25-d721-43a6-810e-976c03a0572d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:02:55 crc kubenswrapper[4626]: I0223 07:02:55.277979 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f00ecb25-d721-43a6-810e-976c03a0572d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.219:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" Feb 23 07:02:55 crc kubenswrapper[4626]: I0223 07:02:55.347937 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:02:55 crc kubenswrapper[4626]: I0223 07:02:55.347981 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:02:56 crc kubenswrapper[4626]: I0223 07:02:56.358639 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3af6bff-9179-4a10-aa63-4227c8933818" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:02:56 crc kubenswrapper[4626]: I0223 07:02:56.358943 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3af6bff-9179-4a10-aa63-4227c8933818" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:03:04 crc kubenswrapper[4626]: I0223 07:03:04.269469 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 07:03:04 crc kubenswrapper[4626]: I0223 07:03:04.271126 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 07:03:04 crc kubenswrapper[4626]: I0223 07:03:04.276458 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:03:05 crc kubenswrapper[4626]: I0223 07:03:05.120615 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:03:05 crc kubenswrapper[4626]: I0223 07:03:05.354904 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:03:05 crc kubenswrapper[4626]: I0223 07:03:05.356239 4626 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:03:05 crc kubenswrapper[4626]: I0223 07:03:05.357804 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:03:05 crc kubenswrapper[4626]: I0223 07:03:05.362216 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:03:06 crc kubenswrapper[4626]: I0223 07:03:06.025732 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 07:03:06 crc kubenswrapper[4626]: I0223 07:03:06.127660 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:03:06 crc kubenswrapper[4626]: I0223 07:03:06.137198 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:03:14 crc kubenswrapper[4626]: I0223 07:03:14.357202 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:03:15 crc kubenswrapper[4626]: I0223 07:03:15.328082 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:03:19 crc kubenswrapper[4626]: I0223 07:03:19.809084 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" containerName="rabbitmq" containerID="cri-o://1e469c0ada8f0d6a21352ad236a3ffa6bfae564238ce58c5b5526e59a3c383d8" gracePeriod=604795 Feb 23 07:03:20 crc kubenswrapper[4626]: I0223 07:03:20.477047 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerName="rabbitmq" containerID="cri-o://66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039" gracePeriod=604795 Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.685960 4626 patch_prober.go:28] 
interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.686955 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.874800 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c6ffc5cc-prjn9"] Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.876368 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.878933 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.953932 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c6ffc5cc-prjn9"] Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.955292 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-svc\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.955478 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-config\") pod 
\"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.955568 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-sb\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.955597 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-nb\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.955671 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-swift-storage-0\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.955786 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djx4w\" (UniqueName: \"kubernetes.io/projected/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-kube-api-access-djx4w\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:25 crc kubenswrapper[4626]: I0223 07:03:25.955828 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-openstack-edpm-ipam\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.061828 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-config\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.061910 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-sb\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.061931 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-nb\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.061985 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-swift-storage-0\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.062071 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djx4w\" (UniqueName: 
\"kubernetes.io/projected/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-kube-api-access-djx4w\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.062107 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-openstack-edpm-ipam\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.062169 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-svc\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.063408 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-sb\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.063451 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-openstack-edpm-ipam\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.064235 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-nb\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.064690 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-config\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.064693 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-svc\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.064935 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-swift-storage-0\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.084283 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djx4w\" (UniqueName: \"kubernetes.io/projected/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-kube-api-access-djx4w\") pod \"dnsmasq-dns-84c6ffc5cc-prjn9\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") " pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.201128 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.386103 4626 generic.go:334] "Generic (PLEG): container finished" podID="cb16cec1-24fc-4504-8968-0c3fb8368f27" containerID="1e469c0ada8f0d6a21352ad236a3ffa6bfae564238ce58c5b5526e59a3c383d8" exitCode=0 Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.386145 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb16cec1-24fc-4504-8968-0c3fb8368f27","Type":"ContainerDied","Data":"1e469c0ada8f0d6a21352ad236a3ffa6bfae564238ce58c5b5526e59a3c383d8"} Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.386173 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb16cec1-24fc-4504-8968-0c3fb8368f27","Type":"ContainerDied","Data":"07e6b3b1e7dbb1272dea499e9b8a60d0f849d9ebcd0c9c88076e33a9ec0d35cc"} Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.386184 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07e6b3b1e7dbb1272dea499e9b8a60d0f849d9ebcd0c9c88076e33a9ec0d35cc" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.415057 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.473410 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-plugins-conf\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.473516 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-server-conf\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.473572 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nrck\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-kube-api-access-9nrck\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.473861 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-plugins\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.473892 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-confd\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.473929 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.474029 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-config-data\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.474111 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb16cec1-24fc-4504-8968-0c3fb8368f27-erlang-cookie-secret\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.474136 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-tls\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.474165 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-erlang-cookie\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.474183 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb16cec1-24fc-4504-8968-0c3fb8368f27-pod-info\") pod \"cb16cec1-24fc-4504-8968-0c3fb8368f27\" (UID: \"cb16cec1-24fc-4504-8968-0c3fb8368f27\") " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.476600 
4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.477584 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.481248 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.484778 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-kube-api-access-9nrck" (OuterVolumeSpecName: "kube-api-access-9nrck") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "kube-api-access-9nrck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.484955 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb16cec1-24fc-4504-8968-0c3fb8368f27-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.485590 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.494807 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.495415 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cb16cec1-24fc-4504-8968-0c3fb8368f27-pod-info" (OuterVolumeSpecName: "pod-info") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.553455 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-config-data" (OuterVolumeSpecName: "config-data") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.554750 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-server-conf" (OuterVolumeSpecName: "server-conf") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580128 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580158 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580170 4626 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb16cec1-24fc-4504-8968-0c3fb8368f27-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580178 4626 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 
23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580186 4626 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580194 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nrck\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-kube-api-access-9nrck\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580202 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580224 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580233 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb16cec1-24fc-4504-8968-0c3fb8368f27-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.580241 4626 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb16cec1-24fc-4504-8968-0c3fb8368f27-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.600921 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.682831 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cb16cec1-24fc-4504-8968-0c3fb8368f27" (UID: "cb16cec1-24fc-4504-8968-0c3fb8368f27"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.683244 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb16cec1-24fc-4504-8968-0c3fb8368f27-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.683259 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:03:26 crc kubenswrapper[4626]: I0223 07:03:26.830320 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c6ffc5cc-prjn9"] Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.187191 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.299981 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-erlang-cookie-secret\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.300103 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-tls\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.300217 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-config-data\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.300272 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp7fk\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-kube-api-access-lp7fk\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.300357 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") " Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.300437 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-erlang-cookie\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") "
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.300462 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-pod-info\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") "
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.301516 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-plugins-conf\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") "
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.301679 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-confd\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") "
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.301760 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-server-conf\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") "
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.301782 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-plugins\") pod \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\" (UID: \"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1\") "
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.302222 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.308370 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.309869 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.309957 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.313535 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.326596 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-pod-info" (OuterVolumeSpecName: "pod-info") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.361903 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.373208 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.374278 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-kube-api-access-lp7fk" (OuterVolumeSpecName: "kube-api-access-lp7fk") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "kube-api-access-lp7fk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.400317 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" event={"ID":"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb","Type":"ContainerStarted","Data":"bce1d09635e26bc1a07119dac15da893f34b516735c501b7e6830df185139b30"}
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.402471 4626 generic.go:334] "Generic (PLEG): container finished" podID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerID="66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039" exitCode=0
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.402603 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.402672 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1","Type":"ContainerDied","Data":"66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039"}
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.402754 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1","Type":"ContainerDied","Data":"101e15c404b4c18fb56512a2757067670b794ab90b07cb9e0d900b121b695d02"}
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.402785 4626 scope.go:117] "RemoveContainer" containerID="66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.402870 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.415976 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.416009 4626 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.416024 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.416038 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp7fk\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-kube-api-access-lp7fk\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.416112 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.416123 4626 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-pod-info\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.416137 4626 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.430524 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-config-data" (OuterVolumeSpecName: "config-data") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.442385 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.445314 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-server-conf" (OuterVolumeSpecName: "server-conf") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.519706 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.519753 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.519764 4626 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-server-conf\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.522062 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" (UID: "bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.585964 4626 scope.go:117] "RemoveContainer" containerID="51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.599547 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.618641 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.622037 4626 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.622581 4626 scope.go:117] "RemoveContainer" containerID="66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039"
Feb 23 07:03:27 crc kubenswrapper[4626]: E0223 07:03:27.623040 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039\": container with ID starting with 66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039 not found: ID does not exist" containerID="66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.623076 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039"} err="failed to get container status \"66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039\": rpc error: code = NotFound desc = could not find container \"66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039\": container with ID starting with 66e446c14d29c448bd86dddad19d109e51a42e812d724b3e895eb4c038adc039 not found: ID does not exist"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.623097 4626 scope.go:117] "RemoveContainer" containerID="51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03"
Feb 23 07:03:27 crc kubenswrapper[4626]: E0223 07:03:27.623462 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03\": container with ID starting with 51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03 not found: ID does not exist" containerID="51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.623482 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03"} err="failed to get container status \"51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03\": rpc error: code = NotFound desc = could not find container \"51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03\": container with ID starting with 51a27e337d7082e0a853e7506ffecaf98ae007d274fedc5a88181dc95ac5ab03 not found: ID does not exist"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.657723 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: E0223 07:03:27.658626 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" containerName="setup-container"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.658649 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" containerName="setup-container"
Feb 23 07:03:27 crc kubenswrapper[4626]: E0223 07:03:27.658659 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerName="setup-container"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.658666 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerName="setup-container"
Feb 23 07:03:27 crc kubenswrapper[4626]: E0223 07:03:27.658704 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" containerName="rabbitmq"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.658713 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" containerName="rabbitmq"
Feb 23 07:03:27 crc kubenswrapper[4626]: E0223 07:03:27.658734 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerName="rabbitmq"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.658741 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerName="rabbitmq"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.658957 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" containerName="rabbitmq"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.658977 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" containerName="rabbitmq"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.660636 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.667290 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6mg66"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.667394 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.667539 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.667675 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.667810 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.667855 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.668096 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.683884 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.727717 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.727879 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.728110 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.728178 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnqpv\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-kube-api-access-mnqpv\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.728224 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.728302 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-config-data\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.728346 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/969fc15c-8289-4aa6-b590-9fa59f05783b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.728391 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.728451 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.729042 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.729150 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/969fc15c-8289-4aa6-b590-9fa59f05783b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.758246 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.771053 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.790740 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.799253 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.812981 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.821700 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.821995 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lhrrt"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.822118 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.822168 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.822237 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.822292 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.822439 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832134 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832180 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832223 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832259 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832292 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832328 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d9572dd-1a1c-4261-a7ba-7538d24d769a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832366 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832409 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832456 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832478 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnqpv\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-kube-api-access-mnqpv\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832548 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832599 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832648 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-config-data\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832697 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/969fc15c-8289-4aa6-b590-9fa59f05783b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832733 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832772 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832805 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832852 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832876 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d9572dd-1a1c-4261-a7ba-7538d24d769a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.832924 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmng4\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-kube-api-access-dmng4\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.833035 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.833082 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/969fc15c-8289-4aa6-b590-9fa59f05783b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.834568 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.835273 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.836435 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.836473 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/969fc15c-8289-4aa6-b590-9fa59f05783b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.836916 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.837216 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/969fc15c-8289-4aa6-b590-9fa59f05783b-config-data\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.837979 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.848525 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/969fc15c-8289-4aa6-b590-9fa59f05783b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.859466 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: E0223 07:03:27.859677 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd0d0747_4f47_4878_b7e9_a7bff2a9ccd1.slice\": RecentStats: unable to find data in memory cache]"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.859746 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.860401 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnqpv\" (UniqueName: \"kubernetes.io/projected/969fc15c-8289-4aa6-b590-9fa59f05783b-kube-api-access-mnqpv\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.894231 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"969fc15c-8289-4aa6-b590-9fa59f05783b\") " pod="openstack/rabbitmq-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935318 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935377 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935431 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName:
\"kubernetes.io/empty-dir/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935466 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d9572dd-1a1c-4261-a7ba-7538d24d769a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935529 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935586 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935607 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935711 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935760 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935794 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d9572dd-1a1c-4261-a7ba-7538d24d769a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935829 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmng4\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-kube-api-access-dmng4\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935828 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.936523 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" 
Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.936862 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.936968 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.935949 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.937180 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d9572dd-1a1c-4261-a7ba-7538d24d769a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.939873 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d9572dd-1a1c-4261-a7ba-7538d24d769a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.941627 4626 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.945303 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d9572dd-1a1c-4261-a7ba-7538d24d769a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.945973 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.952205 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmng4\" (UniqueName: \"kubernetes.io/projected/3d9572dd-1a1c-4261-a7ba-7538d24d769a-kube-api-access-dmng4\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.973230 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d9572dd-1a1c-4261-a7ba-7538d24d769a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:27 crc kubenswrapper[4626]: I0223 07:03:27.993910 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 07:03:28 crc kubenswrapper[4626]: I0223 07:03:28.001412 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1" path="/var/lib/kubelet/pods/bd0d0747-4f47-4878-b7e9-a7bff2a9ccd1/volumes" Feb 23 07:03:28 crc kubenswrapper[4626]: I0223 07:03:28.002440 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb16cec1-24fc-4504-8968-0c3fb8368f27" path="/var/lib/kubelet/pods/cb16cec1-24fc-4504-8968-0c3fb8368f27/volumes" Feb 23 07:03:28 crc kubenswrapper[4626]: I0223 07:03:28.159857 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:03:28 crc kubenswrapper[4626]: I0223 07:03:28.435479 4626 generic.go:334] "Generic (PLEG): container finished" podID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerID="00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3" exitCode=0 Feb 23 07:03:28 crc kubenswrapper[4626]: I0223 07:03:28.435845 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" event={"ID":"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb","Type":"ContainerDied","Data":"00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3"} Feb 23 07:03:28 crc kubenswrapper[4626]: I0223 07:03:28.569039 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:03:28 crc kubenswrapper[4626]: I0223 07:03:28.805890 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:03:29 crc kubenswrapper[4626]: I0223 07:03:29.449338 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" event={"ID":"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb","Type":"ContainerStarted","Data":"35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501"} Feb 23 07:03:29 crc kubenswrapper[4626]: I0223 07:03:29.450986 
4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:29 crc kubenswrapper[4626]: I0223 07:03:29.451116 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"969fc15c-8289-4aa6-b590-9fa59f05783b","Type":"ContainerStarted","Data":"2053e0d6b79a2e6d77984fca98c556063ae1e271b1d4f7f1a4371e5707492b1f"} Feb 23 07:03:29 crc kubenswrapper[4626]: I0223 07:03:29.452173 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d9572dd-1a1c-4261-a7ba-7538d24d769a","Type":"ContainerStarted","Data":"d95469052e7746f7f4454747a537aa903510a44a565b5bb199ed61847f35028b"} Feb 23 07:03:29 crc kubenswrapper[4626]: I0223 07:03:29.474668 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" podStartSLOduration=4.474645862 podStartE2EDuration="4.474645862s" podCreationTimestamp="2026-02-23 07:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:03:29.467415799 +0000 UTC m=+1361.806745065" watchObservedRunningTime="2026-02-23 07:03:29.474645862 +0000 UTC m=+1361.813975118" Feb 23 07:03:30 crc kubenswrapper[4626]: I0223 07:03:30.463590 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d9572dd-1a1c-4261-a7ba-7538d24d769a","Type":"ContainerStarted","Data":"9e657c1f1b6db60b7a1d6f17664109ed7c6db056f3134efb3d54c990e758c7e3"} Feb 23 07:03:30 crc kubenswrapper[4626]: I0223 07:03:30.466946 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"969fc15c-8289-4aa6-b590-9fa59f05783b","Type":"ContainerStarted","Data":"d02fe842839bcf302c9ac44c0ccda89a46e05ee2c749b97d6fe9313b894b3fcb"} Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.203146 4626 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.263267 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746ddbbc65-7nl6w"] Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.263536 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" podUID="5b99373e-a436-4150-b209-0c68797de11e" containerName="dnsmasq-dns" containerID="cri-o://75f7038c6583ef2394671fa64d9fe699d65c2b64bbcae274e4afd7764b492157" gracePeriod=10 Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.426139 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cdb55cb5c-2pfm4"] Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.428363 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.470555 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cdb55cb5c-2pfm4"] Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.511191 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-dns-svc\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.511280 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-dns-swift-storage-0\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 
07:03:36.511417 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-ovsdbserver-nb\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.511460 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-ovsdbserver-sb\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.511487 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-config\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.511557 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5mm\" (UniqueName: \"kubernetes.io/projected/5f9703d1-1761-47e8-8524-c52def1bcac3-kube-api-access-nq5mm\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.511628 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-openstack-edpm-ipam\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " 
pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.528030 4626 generic.go:334] "Generic (PLEG): container finished" podID="5b99373e-a436-4150-b209-0c68797de11e" containerID="75f7038c6583ef2394671fa64d9fe699d65c2b64bbcae274e4afd7764b492157" exitCode=0 Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.528097 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" event={"ID":"5b99373e-a436-4150-b209-0c68797de11e","Type":"ContainerDied","Data":"75f7038c6583ef2394671fa64d9fe699d65c2b64bbcae274e4afd7764b492157"} Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.614409 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-ovsdbserver-nb\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.614487 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-ovsdbserver-sb\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.614542 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-config\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.614606 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5mm\" (UniqueName: 
\"kubernetes.io/projected/5f9703d1-1761-47e8-8524-c52def1bcac3-kube-api-access-nq5mm\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.614682 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-openstack-edpm-ipam\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.614784 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-dns-svc\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.614847 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-dns-swift-storage-0\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.615678 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-ovsdbserver-nb\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.616052 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-config\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.616301 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-dns-svc\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.616668 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-ovsdbserver-sb\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.616875 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-openstack-edpm-ipam\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.625062 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5f9703d1-1761-47e8-8524-c52def1bcac3-dns-swift-storage-0\") pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.638959 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5mm\" (UniqueName: \"kubernetes.io/projected/5f9703d1-1761-47e8-8524-c52def1bcac3-kube-api-access-nq5mm\") 
pod \"dnsmasq-dns-7cdb55cb5c-2pfm4\" (UID: \"5f9703d1-1761-47e8-8524-c52def1bcac3\") " pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.718824 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.753013 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.819194 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-config\") pod \"5b99373e-a436-4150-b209-0c68797de11e\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.819260 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7thpz\" (UniqueName: \"kubernetes.io/projected/5b99373e-a436-4150-b209-0c68797de11e-kube-api-access-7thpz\") pod \"5b99373e-a436-4150-b209-0c68797de11e\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.819301 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-svc\") pod \"5b99373e-a436-4150-b209-0c68797de11e\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.819402 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-sb\") pod \"5b99373e-a436-4150-b209-0c68797de11e\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.819444 4626 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-nb\") pod \"5b99373e-a436-4150-b209-0c68797de11e\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.819487 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-swift-storage-0\") pod \"5b99373e-a436-4150-b209-0c68797de11e\" (UID: \"5b99373e-a436-4150-b209-0c68797de11e\") " Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.829065 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b99373e-a436-4150-b209-0c68797de11e-kube-api-access-7thpz" (OuterVolumeSpecName: "kube-api-access-7thpz") pod "5b99373e-a436-4150-b209-0c68797de11e" (UID: "5b99373e-a436-4150-b209-0c68797de11e"). InnerVolumeSpecName "kube-api-access-7thpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.873041 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b99373e-a436-4150-b209-0c68797de11e" (UID: "5b99373e-a436-4150-b209-0c68797de11e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.873850 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b99373e-a436-4150-b209-0c68797de11e" (UID: "5b99373e-a436-4150-b209-0c68797de11e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.905164 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b99373e-a436-4150-b209-0c68797de11e" (UID: "5b99373e-a436-4150-b209-0c68797de11e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.905210 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5b99373e-a436-4150-b209-0c68797de11e" (UID: "5b99373e-a436-4150-b209-0c68797de11e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.907391 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-config" (OuterVolumeSpecName: "config") pod "5b99373e-a436-4150-b209-0c68797de11e" (UID: "5b99373e-a436-4150-b209-0c68797de11e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.921839 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.921863 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.921872 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.921906 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.921916 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7thpz\" (UniqueName: \"kubernetes.io/projected/5b99373e-a436-4150-b209-0c68797de11e-kube-api-access-7thpz\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:36 crc kubenswrapper[4626]: I0223 07:03:36.921926 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b99373e-a436-4150-b209-0c68797de11e-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.202051 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cdb55cb5c-2pfm4"]
Feb 23 07:03:37 crc kubenswrapper[4626]: W0223 07:03:37.205419 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f9703d1_1761_47e8_8524_c52def1bcac3.slice/crio-4473903c20c2723a36d64ce71ac99a211831381a93f1c269495e56d8f9766803 WatchSource:0}: Error finding container 4473903c20c2723a36d64ce71ac99a211831381a93f1c269495e56d8f9766803: Status 404 returned error can't find the container with id 4473903c20c2723a36d64ce71ac99a211831381a93f1c269495e56d8f9766803
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.540364 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w" event={"ID":"5b99373e-a436-4150-b209-0c68797de11e","Type":"ContainerDied","Data":"3ad139070188a7b901a3f5bed581583e79924e5151957102fe2eac0bf122f471"}
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.540445 4626 scope.go:117] "RemoveContainer" containerID="75f7038c6583ef2394671fa64d9fe699d65c2b64bbcae274e4afd7764b492157"
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.540381 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746ddbbc65-7nl6w"
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.541899 4626 generic.go:334] "Generic (PLEG): container finished" podID="5f9703d1-1761-47e8-8524-c52def1bcac3" containerID="24541e76ddfcf57665710caa5f40e60c53058927442387b99512e63edf9953d7" exitCode=0
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.541954 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" event={"ID":"5f9703d1-1761-47e8-8524-c52def1bcac3","Type":"ContainerDied","Data":"24541e76ddfcf57665710caa5f40e60c53058927442387b99512e63edf9953d7"}
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.541992 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" event={"ID":"5f9703d1-1761-47e8-8524-c52def1bcac3","Type":"ContainerStarted","Data":"4473903c20c2723a36d64ce71ac99a211831381a93f1c269495e56d8f9766803"}
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.562468 4626 scope.go:117] "RemoveContainer" containerID="714c8abfefee636df50db9183876aaea31f1ac358cf90edb605723698af4399e"
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.752212 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746ddbbc65-7nl6w"]
Feb 23 07:03:37 crc kubenswrapper[4626]: I0223 07:03:37.761669 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-746ddbbc65-7nl6w"]
Feb 23 07:03:38 crc kubenswrapper[4626]: I0223 07:03:38.036005 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b99373e-a436-4150-b209-0c68797de11e" path="/var/lib/kubelet/pods/5b99373e-a436-4150-b209-0c68797de11e/volumes"
Feb 23 07:03:38 crc kubenswrapper[4626]: I0223 07:03:38.555224 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" event={"ID":"5f9703d1-1761-47e8-8524-c52def1bcac3","Type":"ContainerStarted","Data":"5eac9079b6775c96b795ecdafb9182632a3aeea2914a95a57b6655822c1c7d51"}
Feb 23 07:03:38 crc kubenswrapper[4626]: I0223 07:03:38.555636 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4"
Feb 23 07:03:38 crc kubenswrapper[4626]: I0223 07:03:38.575758 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4" podStartSLOduration=2.575735774 podStartE2EDuration="2.575735774s" podCreationTimestamp="2026-02-23 07:03:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:03:38.572308122 +0000 UTC m=+1370.911637387" watchObservedRunningTime="2026-02-23 07:03:38.575735774 +0000 UTC m=+1370.915065040"
Feb 23 07:03:46 crc kubenswrapper[4626]: I0223 07:03:46.754489 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cdb55cb5c-2pfm4"
Feb 23 07:03:46 crc kubenswrapper[4626]: I0223 07:03:46.834258 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c6ffc5cc-prjn9"]
Feb 23 07:03:46 crc kubenswrapper[4626]: I0223 07:03:46.834774 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" podUID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerName="dnsmasq-dns" containerID="cri-o://35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501" gracePeriod=10
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.278871 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.462554 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djx4w\" (UniqueName: \"kubernetes.io/projected/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-kube-api-access-djx4w\") pod \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") "
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.462632 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-swift-storage-0\") pod \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") "
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.462775 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-svc\") pod \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") "
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.462813 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-nb\") pod \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") "
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.463622 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-sb\") pod \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") "
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.463664 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-openstack-edpm-ipam\") pod \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") "
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.463738 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-config\") pod \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\" (UID: \"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb\") "
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.469724 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-kube-api-access-djx4w" (OuterVolumeSpecName: "kube-api-access-djx4w") pod "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" (UID: "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb"). InnerVolumeSpecName "kube-api-access-djx4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.508637 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" (UID: "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.510009 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" (UID: "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.512686 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" (UID: "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.516066 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-config" (OuterVolumeSpecName: "config") pod "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" (UID: "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.520575 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" (UID: "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.531734 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" (UID: "ad7cfdaa-5068-4627-bd89-10e8c8aa7afb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.567736 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.567774 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.567796 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.567811 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djx4w\" (UniqueName: \"kubernetes.io/projected/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-kube-api-access-djx4w\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.567823 4626 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.567834 4626 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.567923 4626 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.651854 4626 generic.go:334] "Generic (PLEG): container finished" podID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerID="35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501" exitCode=0
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.651923 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" event={"ID":"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb","Type":"ContainerDied","Data":"35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501"}
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.651951 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.652001 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c6ffc5cc-prjn9" event={"ID":"ad7cfdaa-5068-4627-bd89-10e8c8aa7afb","Type":"ContainerDied","Data":"bce1d09635e26bc1a07119dac15da893f34b516735c501b7e6830df185139b30"}
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.652030 4626 scope.go:117] "RemoveContainer" containerID="35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.677980 4626 scope.go:117] "RemoveContainer" containerID="00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.698914 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c6ffc5cc-prjn9"]
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.703133 4626 scope.go:117] "RemoveContainer" containerID="35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501"
Feb 23 07:03:47 crc kubenswrapper[4626]: E0223 07:03:47.703528 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501\": container with ID starting with 35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501 not found: ID does not exist" containerID="35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.703577 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501"} err="failed to get container status \"35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501\": rpc error: code = NotFound desc = could not find container \"35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501\": container with ID starting with 35671b7caa51d9656cedc75ef5546a1a64e46b21befc8c821aaa1941dca20501 not found: ID does not exist"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.703612 4626 scope.go:117] "RemoveContainer" containerID="00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3"
Feb 23 07:03:47 crc kubenswrapper[4626]: E0223 07:03:47.704092 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3\": container with ID starting with 00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3 not found: ID does not exist" containerID="00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.704139 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3"} err="failed to get container status \"00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3\": rpc error: code = NotFound desc = could not find container \"00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3\": container with ID starting with 00ba21b50b03ea36eebfd85aec03925c20dd9ec24c87f22226b8182c04c8f4c3 not found: ID does not exist"
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.711922 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c6ffc5cc-prjn9"]
Feb 23 07:03:47 crc kubenswrapper[4626]: I0223 07:03:47.993645 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" path="/var/lib/kubelet/pods/ad7cfdaa-5068-4627-bd89-10e8c8aa7afb/volumes"
Feb 23 07:03:55 crc kubenswrapper[4626]: I0223 07:03:55.685570 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:03:55 crc kubenswrapper[4626]: I0223 07:03:55.686428 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.400427 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"]
Feb 23 07:03:59 crc kubenswrapper[4626]: E0223 07:03:59.401655 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b99373e-a436-4150-b209-0c68797de11e" containerName="init"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.401673 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b99373e-a436-4150-b209-0c68797de11e" containerName="init"
Feb 23 07:03:59 crc kubenswrapper[4626]: E0223 07:03:59.401695 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerName="init"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.401702 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerName="init"
Feb 23 07:03:59 crc kubenswrapper[4626]: E0223 07:03:59.401733 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b99373e-a436-4150-b209-0c68797de11e" containerName="dnsmasq-dns"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.401739 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b99373e-a436-4150-b209-0c68797de11e" containerName="dnsmasq-dns"
Feb 23 07:03:59 crc kubenswrapper[4626]: E0223 07:03:59.401753 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerName="dnsmasq-dns"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.401758 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerName="dnsmasq-dns"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.401972 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7cfdaa-5068-4627-bd89-10e8c8aa7afb" containerName="dnsmasq-dns"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.402000 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b99373e-a436-4150-b209-0c68797de11e" containerName="dnsmasq-dns"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.402802 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.406763 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.408002 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.408075 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.408274 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.418567 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"]
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.445552 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.445713 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xjw\" (UniqueName: \"kubernetes.io/projected/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-kube-api-access-w5xjw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.446193 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.446283 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.548554 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.548704 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.548767 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xjw\" (UniqueName: \"kubernetes.io/projected/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-kube-api-access-w5xjw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.549020 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.561530 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.561851 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.564696 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.565556 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xjw\" (UniqueName: \"kubernetes.io/projected/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-kube-api-access-w5xjw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c782w\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:03:59 crc kubenswrapper[4626]: I0223 07:03:59.722630 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:04:00 crc kubenswrapper[4626]: I0223 07:04:00.342736 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"]
Feb 23 07:04:00 crc kubenswrapper[4626]: I0223 07:04:00.350668 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 07:04:00 crc kubenswrapper[4626]: I0223 07:04:00.789103 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w" event={"ID":"5bd897ec-9ff1-4dc4-87c5-db910e8593e4","Type":"ContainerStarted","Data":"6a08928b4c882de81be09742c8ee76723877d781f90e64e6fd007fb348e0c973"}
Feb 23 07:04:01 crc kubenswrapper[4626]: I0223 07:04:01.806618 4626 generic.go:334] "Generic (PLEG): container finished" podID="3d9572dd-1a1c-4261-a7ba-7538d24d769a" containerID="9e657c1f1b6db60b7a1d6f17664109ed7c6db056f3134efb3d54c990e758c7e3" exitCode=0
Feb 23 07:04:01 crc kubenswrapper[4626]: I0223 07:04:01.806696 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d9572dd-1a1c-4261-a7ba-7538d24d769a","Type":"ContainerDied","Data":"9e657c1f1b6db60b7a1d6f17664109ed7c6db056f3134efb3d54c990e758c7e3"}
Feb 23 07:04:02 crc kubenswrapper[4626]: I0223 07:04:02.823260 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d9572dd-1a1c-4261-a7ba-7538d24d769a","Type":"ContainerStarted","Data":"30d0e44b746bfe0ff174ef128fde547a36a6e39e8a5b9c13da8d6ab75bf9a04d"}
Feb 23 07:04:02 crc kubenswrapper[4626]: I0223 07:04:02.823986 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:04:02 crc kubenswrapper[4626]: I0223 07:04:02.828626 4626 generic.go:334] "Generic (PLEG): container finished" podID="969fc15c-8289-4aa6-b590-9fa59f05783b" containerID="d02fe842839bcf302c9ac44c0ccda89a46e05ee2c749b97d6fe9313b894b3fcb" exitCode=0
Feb 23 07:04:02 crc kubenswrapper[4626]: I0223 07:04:02.828671 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"969fc15c-8289-4aa6-b590-9fa59f05783b","Type":"ContainerDied","Data":"d02fe842839bcf302c9ac44c0ccda89a46e05ee2c749b97d6fe9313b894b3fcb"}
Feb 23 07:04:02 crc kubenswrapper[4626]: I0223 07:04:02.855940 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.855921008 podStartE2EDuration="35.855921008s" podCreationTimestamp="2026-02-23 07:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:04:02.854922877 +0000 UTC m=+1395.194252143" watchObservedRunningTime="2026-02-23 07:04:02.855921008 +0000 UTC m=+1395.195250275"
Feb 23 07:04:03 crc kubenswrapper[4626]: I0223 07:04:03.853289 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"969fc15c-8289-4aa6-b590-9fa59f05783b","Type":"ContainerStarted","Data":"cd3c8e2d5719bc70635cb9e11d3d7e611f65ae94d948fcac3b81e4f792036b4b"}
Feb 23 07:04:03 crc kubenswrapper[4626]: I0223 07:04:03.854008 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 23 07:04:03 crc kubenswrapper[4626]: I0223 07:04:03.895194 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.895157688 podStartE2EDuration="36.895157688s" podCreationTimestamp="2026-02-23 07:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:04:03.892350906 +0000 UTC m=+1396.231680172" watchObservedRunningTime="2026-02-23 07:04:03.895157688 +0000 UTC m=+1396.234486954"
Feb 23 07:04:10 crc kubenswrapper[4626]: I0223 07:04:10.962549 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w" event={"ID":"5bd897ec-9ff1-4dc4-87c5-db910e8593e4","Type":"ContainerStarted","Data":"50443f8ff42f597f9da155cdd0c17746c602c08fd9f000530e53a4f962169b41"}
Feb 23 07:04:10 crc kubenswrapper[4626]: I0223 07:04:10.980070 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w" podStartSLOduration=1.924803775 podStartE2EDuration="11.980043363s" podCreationTimestamp="2026-02-23 07:03:59 +0000 UTC" firstStartedPulling="2026-02-23 07:04:00.349726661 +0000 UTC m=+1392.689055927" lastFinishedPulling="2026-02-23 07:04:10.404966249 +0000 UTC m=+1402.744295515" observedRunningTime="2026-02-23 07:04:10.975575889 +0000 UTC m=+1403.314905155" watchObservedRunningTime="2026-02-23 07:04:10.980043363 +0000 UTC m=+1403.319372629"
Feb 23 07:04:17 crc kubenswrapper[4626]: I0223 07:04:17.998921 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 23 07:04:18 crc kubenswrapper[4626]: I0223 07:04:18.163774 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:04:18 crc kubenswrapper[4626]: I0223 07:04:18.901975 4626 scope.go:117] "RemoveContainer" containerID="335080fdbc50ec94238cbe05a5d58e68c601cf4173ffeb5e5ce58f7cbd592c4a"
Feb 23 07:04:18 crc kubenswrapper[4626]: I0223 07:04:18.934405 4626 scope.go:117] "RemoveContainer" containerID="06e385e38283193cdbfcddd982ff856e4477c5930a30d9606b70b08d9fcacc08"
Feb 23 07:04:22 crc kubenswrapper[4626]: I0223 07:04:22.110176 4626 generic.go:334] "Generic (PLEG): container finished" podID="5bd897ec-9ff1-4dc4-87c5-db910e8593e4" containerID="50443f8ff42f597f9da155cdd0c17746c602c08fd9f000530e53a4f962169b41" exitCode=0
Feb 23 07:04:22 crc kubenswrapper[4626]: I0223 07:04:22.110216 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w" event={"ID":"5bd897ec-9ff1-4dc4-87c5-db910e8593e4","Type":"ContainerDied","Data":"50443f8ff42f597f9da155cdd0c17746c602c08fd9f000530e53a4f962169b41"}
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.532826 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w"
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.598294 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-ssh-key-openstack-edpm-ipam\") pod \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") "
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.598388 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5xjw\" (UniqueName: \"kubernetes.io/projected/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-kube-api-access-w5xjw\") pod \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") "
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.598427 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-repo-setup-combined-ca-bundle\") pod \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") "
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.598857 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-inventory\") pod \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\" (UID: \"5bd897ec-9ff1-4dc4-87c5-db910e8593e4\") "
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.605029 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5bd897ec-9ff1-4dc4-87c5-db910e8593e4" (UID: "5bd897ec-9ff1-4dc4-87c5-db910e8593e4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.605381 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-kube-api-access-w5xjw" (OuterVolumeSpecName: "kube-api-access-w5xjw") pod "5bd897ec-9ff1-4dc4-87c5-db910e8593e4" (UID: "5bd897ec-9ff1-4dc4-87c5-db910e8593e4"). InnerVolumeSpecName "kube-api-access-w5xjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.622270 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5bd897ec-9ff1-4dc4-87c5-db910e8593e4" (UID: "5bd897ec-9ff1-4dc4-87c5-db910e8593e4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.626949 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-inventory" (OuterVolumeSpecName: "inventory") pod "5bd897ec-9ff1-4dc4-87c5-db910e8593e4" (UID: "5bd897ec-9ff1-4dc4-87c5-db910e8593e4"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.701907 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.701941 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.701955 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5xjw\" (UniqueName: \"kubernetes.io/projected/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-kube-api-access-w5xjw\") on node \"crc\" DevicePath \"\"" Feb 23 07:04:23 crc kubenswrapper[4626]: I0223 07:04:23.701984 4626 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd897ec-9ff1-4dc4-87c5-db910e8593e4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.135221 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w" event={"ID":"5bd897ec-9ff1-4dc4-87c5-db910e8593e4","Type":"ContainerDied","Data":"6a08928b4c882de81be09742c8ee76723877d781f90e64e6fd007fb348e0c973"} Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.135287 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a08928b4c882de81be09742c8ee76723877d781f90e64e6fd007fb348e0c973" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.135551 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c782w" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.202325 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm"] Feb 23 07:04:24 crc kubenswrapper[4626]: E0223 07:04:24.202833 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd897ec-9ff1-4dc4-87c5-db910e8593e4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.202857 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd897ec-9ff1-4dc4-87c5-db910e8593e4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.203088 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd897ec-9ff1-4dc4-87c5-db910e8593e4" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.204770 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.207281 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.207487 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.207773 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.208678 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.212255 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.212348 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtvxs\" (UniqueName: \"kubernetes.io/projected/1d465e63-5644-4732-a661-1134ffa03a78-kube-api-access-mtvxs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.212434 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.220327 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm"] Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.315218 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.315749 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.315997 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtvxs\" (UniqueName: \"kubernetes.io/projected/1d465e63-5644-4732-a661-1134ffa03a78-kube-api-access-mtvxs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.319883 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.320293 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.331697 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtvxs\" (UniqueName: \"kubernetes.io/projected/1d465e63-5644-4732-a661-1134ffa03a78-kube-api-access-mtvxs\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cb2zm\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:24 crc kubenswrapper[4626]: I0223 07:04:24.518675 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:25 crc kubenswrapper[4626]: I0223 07:04:25.066283 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm"] Feb 23 07:04:25 crc kubenswrapper[4626]: I0223 07:04:25.144694 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" event={"ID":"1d465e63-5644-4732-a661-1134ffa03a78","Type":"ContainerStarted","Data":"6299bdf8c0074cf4186a9aa4f6640b447e196c9393bedb6cb65c26364972cca0"} Feb 23 07:04:25 crc kubenswrapper[4626]: I0223 07:04:25.685182 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:04:25 crc kubenswrapper[4626]: I0223 07:04:25.685478 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:04:25 crc kubenswrapper[4626]: I0223 07:04:25.685555 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:04:25 crc kubenswrapper[4626]: I0223 07:04:25.687060 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ee7851d94adc9620d81b6807bd44d726c23bf9b55ab16cf5c2f4335cb177b50"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Feb 23 07:04:25 crc kubenswrapper[4626]: I0223 07:04:25.687151 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://3ee7851d94adc9620d81b6807bd44d726c23bf9b55ab16cf5c2f4335cb177b50" gracePeriod=600 Feb 23 07:04:26 crc kubenswrapper[4626]: I0223 07:04:26.160782 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="3ee7851d94adc9620d81b6807bd44d726c23bf9b55ab16cf5c2f4335cb177b50" exitCode=0 Feb 23 07:04:26 crc kubenswrapper[4626]: I0223 07:04:26.160870 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"3ee7851d94adc9620d81b6807bd44d726c23bf9b55ab16cf5c2f4335cb177b50"} Feb 23 07:04:26 crc kubenswrapper[4626]: I0223 07:04:26.161146 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"} Feb 23 07:04:26 crc kubenswrapper[4626]: I0223 07:04:26.161189 4626 scope.go:117] "RemoveContainer" containerID="a5470378b5d8c9e6d24ed5c140362129ba764fb086113e814588933f56ca4c24" Feb 23 07:04:26 crc kubenswrapper[4626]: I0223 07:04:26.163250 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" event={"ID":"1d465e63-5644-4732-a661-1134ffa03a78","Type":"ContainerStarted","Data":"520b82db1114c45710d7639022e88b749fc4625559afc3a5425f2d7f825a1f26"} Feb 23 07:04:26 crc kubenswrapper[4626]: I0223 07:04:26.199057 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" podStartSLOduration=1.663636276 podStartE2EDuration="2.199042409s" podCreationTimestamp="2026-02-23 07:04:24 +0000 UTC" firstStartedPulling="2026-02-23 07:04:25.071402164 +0000 UTC m=+1417.410731430" lastFinishedPulling="2026-02-23 07:04:25.606808296 +0000 UTC m=+1417.946137563" observedRunningTime="2026-02-23 07:04:26.197282442 +0000 UTC m=+1418.536611708" watchObservedRunningTime="2026-02-23 07:04:26.199042409 +0000 UTC m=+1418.538371676" Feb 23 07:04:28 crc kubenswrapper[4626]: I0223 07:04:28.185565 4626 generic.go:334] "Generic (PLEG): container finished" podID="1d465e63-5644-4732-a661-1134ffa03a78" containerID="520b82db1114c45710d7639022e88b749fc4625559afc3a5425f2d7f825a1f26" exitCode=0 Feb 23 07:04:28 crc kubenswrapper[4626]: I0223 07:04:28.185696 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" event={"ID":"1d465e63-5644-4732-a661-1134ffa03a78","Type":"ContainerDied","Data":"520b82db1114c45710d7639022e88b749fc4625559afc3a5425f2d7f825a1f26"} Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.544004 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.654549 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-ssh-key-openstack-edpm-ipam\") pod \"1d465e63-5644-4732-a661-1134ffa03a78\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.654624 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtvxs\" (UniqueName: \"kubernetes.io/projected/1d465e63-5644-4732-a661-1134ffa03a78-kube-api-access-mtvxs\") pod \"1d465e63-5644-4732-a661-1134ffa03a78\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.654880 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-inventory\") pod \"1d465e63-5644-4732-a661-1134ffa03a78\" (UID: \"1d465e63-5644-4732-a661-1134ffa03a78\") " Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.663619 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d465e63-5644-4732-a661-1134ffa03a78-kube-api-access-mtvxs" (OuterVolumeSpecName: "kube-api-access-mtvxs") pod "1d465e63-5644-4732-a661-1134ffa03a78" (UID: "1d465e63-5644-4732-a661-1134ffa03a78"). InnerVolumeSpecName "kube-api-access-mtvxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.683570 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1d465e63-5644-4732-a661-1134ffa03a78" (UID: "1d465e63-5644-4732-a661-1134ffa03a78"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.688360 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-inventory" (OuterVolumeSpecName: "inventory") pod "1d465e63-5644-4732-a661-1134ffa03a78" (UID: "1d465e63-5644-4732-a661-1134ffa03a78"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.758345 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.758380 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d465e63-5644-4732-a661-1134ffa03a78-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:04:29 crc kubenswrapper[4626]: I0223 07:04:29.758393 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtvxs\" (UniqueName: \"kubernetes.io/projected/1d465e63-5644-4732-a661-1134ffa03a78-kube-api-access-mtvxs\") on node \"crc\" DevicePath \"\"" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.212406 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" 
event={"ID":"1d465e63-5644-4732-a661-1134ffa03a78","Type":"ContainerDied","Data":"6299bdf8c0074cf4186a9aa4f6640b447e196c9393bedb6cb65c26364972cca0"} Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.212462 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6299bdf8c0074cf4186a9aa4f6640b447e196c9393bedb6cb65c26364972cca0" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.212537 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cb2zm" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.639710 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6"] Feb 23 07:04:30 crc kubenswrapper[4626]: E0223 07:04:30.640656 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d465e63-5644-4732-a661-1134ffa03a78" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.640675 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d465e63-5644-4732-a661-1134ffa03a78" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.640913 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d465e63-5644-4732-a661-1134ffa03a78" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.641721 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.643705 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.644568 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.644734 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.644799 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.662263 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6"] Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.676926 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.676957 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.677223 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.677261 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9lhw\" (UniqueName: \"kubernetes.io/projected/5d5da7da-f1d4-4a24-9a9b-e22d85625761-kube-api-access-p9lhw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.778866 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.778914 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.779073 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.779109 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9lhw\" (UniqueName: \"kubernetes.io/projected/5d5da7da-f1d4-4a24-9a9b-e22d85625761-kube-api-access-p9lhw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.785220 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.786165 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.788447 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.793853 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9lhw\" (UniqueName: \"kubernetes.io/projected/5d5da7da-f1d4-4a24-9a9b-e22d85625761-kube-api-access-p9lhw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:30 crc kubenswrapper[4626]: I0223 07:04:30.965604 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:04:31 crc kubenswrapper[4626]: I0223 07:04:31.489844 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6"] Feb 23 07:04:31 crc kubenswrapper[4626]: W0223 07:04:31.499566 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d5da7da_f1d4_4a24_9a9b_e22d85625761.slice/crio-bd5dd6768d8a0d417b762f8208aea5014850c00da0de48a98fc1aea4eb0105ed WatchSource:0}: Error finding container bd5dd6768d8a0d417b762f8208aea5014850c00da0de48a98fc1aea4eb0105ed: Status 404 returned error can't find the container with id bd5dd6768d8a0d417b762f8208aea5014850c00da0de48a98fc1aea4eb0105ed Feb 23 07:04:32 crc kubenswrapper[4626]: I0223 07:04:32.237450 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" event={"ID":"5d5da7da-f1d4-4a24-9a9b-e22d85625761","Type":"ContainerStarted","Data":"6fc699eef625bf7e6319594b95648677268098cd2904b6f0477cb28a460728e1"} Feb 23 07:04:32 crc kubenswrapper[4626]: I0223 07:04:32.237813 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" 
event={"ID":"5d5da7da-f1d4-4a24-9a9b-e22d85625761","Type":"ContainerStarted","Data":"bd5dd6768d8a0d417b762f8208aea5014850c00da0de48a98fc1aea4eb0105ed"} Feb 23 07:04:32 crc kubenswrapper[4626]: I0223 07:04:32.258171 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" podStartSLOduration=1.773626189 podStartE2EDuration="2.258139883s" podCreationTimestamp="2026-02-23 07:04:30 +0000 UTC" firstStartedPulling="2026-02-23 07:04:31.502434619 +0000 UTC m=+1423.841763876" lastFinishedPulling="2026-02-23 07:04:31.986948304 +0000 UTC m=+1424.326277570" observedRunningTime="2026-02-23 07:04:32.254041565 +0000 UTC m=+1424.593370831" watchObservedRunningTime="2026-02-23 07:04:32.258139883 +0000 UTC m=+1424.597469149" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.807088 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gw24g"] Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.810040 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.816703 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw24g"] Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.876672 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-catalog-content\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.876825 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pqgt\" (UniqueName: \"kubernetes.io/projected/edd4ae4f-4917-41aa-8c19-65e78823b8c4-kube-api-access-5pqgt\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.877188 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-utilities\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.979725 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-utilities\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.979872 4626 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-catalog-content\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.979956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pqgt\" (UniqueName: \"kubernetes.io/projected/edd4ae4f-4917-41aa-8c19-65e78823b8c4-kube-api-access-5pqgt\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.980189 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-utilities\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:03 crc kubenswrapper[4626]: I0223 07:05:03.980269 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-catalog-content\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:04 crc kubenswrapper[4626]: I0223 07:05:04.002044 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pqgt\" (UniqueName: \"kubernetes.io/projected/edd4ae4f-4917-41aa-8c19-65e78823b8c4-kube-api-access-5pqgt\") pod \"redhat-marketplace-gw24g\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:04 crc kubenswrapper[4626]: I0223 07:05:04.128291 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:04 crc kubenswrapper[4626]: I0223 07:05:04.553836 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw24g"] Feb 23 07:05:04 crc kubenswrapper[4626]: I0223 07:05:04.586531 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw24g" event={"ID":"edd4ae4f-4917-41aa-8c19-65e78823b8c4","Type":"ContainerStarted","Data":"002f554735f13b64adf538056bf17b98e286e229e58cd9372996f639cb64dcbe"} Feb 23 07:05:05 crc kubenswrapper[4626]: I0223 07:05:05.598109 4626 generic.go:334] "Generic (PLEG): container finished" podID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerID="83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1" exitCode=0 Feb 23 07:05:05 crc kubenswrapper[4626]: I0223 07:05:05.598244 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw24g" event={"ID":"edd4ae4f-4917-41aa-8c19-65e78823b8c4","Type":"ContainerDied","Data":"83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1"} Feb 23 07:05:06 crc kubenswrapper[4626]: I0223 07:05:06.617928 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw24g" event={"ID":"edd4ae4f-4917-41aa-8c19-65e78823b8c4","Type":"ContainerStarted","Data":"ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07"} Feb 23 07:05:07 crc kubenswrapper[4626]: I0223 07:05:07.631081 4626 generic.go:334] "Generic (PLEG): container finished" podID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerID="ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07" exitCode=0 Feb 23 07:05:07 crc kubenswrapper[4626]: I0223 07:05:07.631139 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw24g" 
event={"ID":"edd4ae4f-4917-41aa-8c19-65e78823b8c4","Type":"ContainerDied","Data":"ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07"} Feb 23 07:05:08 crc kubenswrapper[4626]: I0223 07:05:08.643566 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw24g" event={"ID":"edd4ae4f-4917-41aa-8c19-65e78823b8c4","Type":"ContainerStarted","Data":"94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d"} Feb 23 07:05:08 crc kubenswrapper[4626]: I0223 07:05:08.671706 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gw24g" podStartSLOduration=3.16799129 podStartE2EDuration="5.671681636s" podCreationTimestamp="2026-02-23 07:05:03 +0000 UTC" firstStartedPulling="2026-02-23 07:05:05.601526464 +0000 UTC m=+1457.940855731" lastFinishedPulling="2026-02-23 07:05:08.105216811 +0000 UTC m=+1460.444546077" observedRunningTime="2026-02-23 07:05:08.66129992 +0000 UTC m=+1461.000629186" watchObservedRunningTime="2026-02-23 07:05:08.671681636 +0000 UTC m=+1461.011010902" Feb 23 07:05:14 crc kubenswrapper[4626]: I0223 07:05:14.128727 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:14 crc kubenswrapper[4626]: I0223 07:05:14.129449 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:14 crc kubenswrapper[4626]: I0223 07:05:14.169929 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:14 crc kubenswrapper[4626]: I0223 07:05:14.744410 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:15 crc kubenswrapper[4626]: I0223 07:05:15.235227 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-gw24g"] Feb 23 07:05:16 crc kubenswrapper[4626]: I0223 07:05:16.723600 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gw24g" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="registry-server" containerID="cri-o://94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d" gracePeriod=2 Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.130211 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.215110 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-catalog-content\") pod \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.215364 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pqgt\" (UniqueName: \"kubernetes.io/projected/edd4ae4f-4917-41aa-8c19-65e78823b8c4-kube-api-access-5pqgt\") pod \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.215790 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-utilities\") pod \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\" (UID: \"edd4ae4f-4917-41aa-8c19-65e78823b8c4\") " Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.216285 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-utilities" (OuterVolumeSpecName: "utilities") pod "edd4ae4f-4917-41aa-8c19-65e78823b8c4" (UID: 
"edd4ae4f-4917-41aa-8c19-65e78823b8c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.217066 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.226902 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd4ae4f-4917-41aa-8c19-65e78823b8c4-kube-api-access-5pqgt" (OuterVolumeSpecName: "kube-api-access-5pqgt") pod "edd4ae4f-4917-41aa-8c19-65e78823b8c4" (UID: "edd4ae4f-4917-41aa-8c19-65e78823b8c4"). InnerVolumeSpecName "kube-api-access-5pqgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.233144 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edd4ae4f-4917-41aa-8c19-65e78823b8c4" (UID: "edd4ae4f-4917-41aa-8c19-65e78823b8c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.319462 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edd4ae4f-4917-41aa-8c19-65e78823b8c4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.319648 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pqgt\" (UniqueName: \"kubernetes.io/projected/edd4ae4f-4917-41aa-8c19-65e78823b8c4-kube-api-access-5pqgt\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.735948 4626 generic.go:334] "Generic (PLEG): container finished" podID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerID="94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d" exitCode=0 Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.736018 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw24g" event={"ID":"edd4ae4f-4917-41aa-8c19-65e78823b8c4","Type":"ContainerDied","Data":"94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d"} Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.736083 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gw24g" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.737615 4626 scope.go:117] "RemoveContainer" containerID="94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.737522 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gw24g" event={"ID":"edd4ae4f-4917-41aa-8c19-65e78823b8c4","Type":"ContainerDied","Data":"002f554735f13b64adf538056bf17b98e286e229e58cd9372996f639cb64dcbe"} Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.786799 4626 scope.go:117] "RemoveContainer" containerID="ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.805580 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw24g"] Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.832076 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gw24g"] Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.883710 4626 scope.go:117] "RemoveContainer" containerID="83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.929930 4626 scope.go:117] "RemoveContainer" containerID="94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d" Feb 23 07:05:17 crc kubenswrapper[4626]: E0223 07:05:17.930402 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d\": container with ID starting with 94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d not found: ID does not exist" containerID="94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.930430 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d"} err="failed to get container status \"94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d\": rpc error: code = NotFound desc = could not find container \"94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d\": container with ID starting with 94e510e29ac3997eb737e61d562ab35feef795c5830323b6397ddcc41c60db0d not found: ID does not exist" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.930450 4626 scope.go:117] "RemoveContainer" containerID="ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07" Feb 23 07:05:17 crc kubenswrapper[4626]: E0223 07:05:17.930705 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07\": container with ID starting with ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07 not found: ID does not exist" containerID="ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.930725 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07"} err="failed to get container status \"ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07\": rpc error: code = NotFound desc = could not find container \"ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07\": container with ID starting with ba272ee9dc5d5eb1496bc18b56c98c686fc8926a28b289cfd7a525c051ccef07 not found: ID does not exist" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.930738 4626 scope.go:117] "RemoveContainer" containerID="83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1" Feb 23 07:05:17 crc kubenswrapper[4626]: E0223 
07:05:17.930978 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1\": container with ID starting with 83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1 not found: ID does not exist" containerID="83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.930997 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1"} err="failed to get container status \"83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1\": rpc error: code = NotFound desc = could not find container \"83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1\": container with ID starting with 83c30fce911690f575deb55b69ee79500d5317893e21ba5783ebdb2f90fe81f1 not found: ID does not exist" Feb 23 07:05:17 crc kubenswrapper[4626]: I0223 07:05:17.990824 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" path="/var/lib/kubelet/pods/edd4ae4f-4917-41aa-8c19-65e78823b8c4/volumes" Feb 23 07:05:19 crc kubenswrapper[4626]: I0223 07:05:19.089333 4626 scope.go:117] "RemoveContainer" containerID="1e469c0ada8f0d6a21352ad236a3ffa6bfae564238ce58c5b5526e59a3c383d8" Feb 23 07:05:19 crc kubenswrapper[4626]: I0223 07:05:19.112442 4626 scope.go:117] "RemoveContainer" containerID="6209b813b4d475ba8bf98581450d37fa18b16e29c0ca127ea5be5ce2a3cebccc" Feb 23 07:05:19 crc kubenswrapper[4626]: I0223 07:05:19.139869 4626 scope.go:117] "RemoveContainer" containerID="217bf194e4e6f031523cd713c7345593b7735ac7dafeaf1a492d3bf1998076dd" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.151152 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2v7gf"] Feb 23 07:06:14 crc 
kubenswrapper[4626]: E0223 07:06:14.152146 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="registry-server" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.152160 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="registry-server" Feb 23 07:06:14 crc kubenswrapper[4626]: E0223 07:06:14.152174 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="extract-content" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.152180 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="extract-content" Feb 23 07:06:14 crc kubenswrapper[4626]: E0223 07:06:14.152212 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="extract-utilities" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.152218 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="extract-utilities" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.152402 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd4ae4f-4917-41aa-8c19-65e78823b8c4" containerName="registry-server" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.153696 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.164577 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v7gf"] Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.169366 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-utilities\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.169448 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvn7n\" (UniqueName: \"kubernetes.io/projected/c3bbb358-b0aa-4501-ab41-3b9369d3043d-kube-api-access-kvn7n\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.169615 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-catalog-content\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.272727 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-utilities\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.272908 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kvn7n\" (UniqueName: \"kubernetes.io/projected/c3bbb358-b0aa-4501-ab41-3b9369d3043d-kube-api-access-kvn7n\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.273318 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-utilities\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.273341 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-catalog-content\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.273746 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-catalog-content\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.293037 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvn7n\" (UniqueName: \"kubernetes.io/projected/c3bbb358-b0aa-4501-ab41-3b9369d3043d-kube-api-access-kvn7n\") pod \"community-operators-2v7gf\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.476335 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:14 crc kubenswrapper[4626]: I0223 07:06:14.955574 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v7gf"] Feb 23 07:06:15 crc kubenswrapper[4626]: I0223 07:06:15.324557 4626 generic.go:334] "Generic (PLEG): container finished" podID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerID="8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a" exitCode=0 Feb 23 07:06:15 crc kubenswrapper[4626]: I0223 07:06:15.324632 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v7gf" event={"ID":"c3bbb358-b0aa-4501-ab41-3b9369d3043d","Type":"ContainerDied","Data":"8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a"} Feb 23 07:06:15 crc kubenswrapper[4626]: I0223 07:06:15.325598 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v7gf" event={"ID":"c3bbb358-b0aa-4501-ab41-3b9369d3043d","Type":"ContainerStarted","Data":"dcf120edb3e701baadbea2e8d0247acad5539749edea6d1df1afea288dd763a5"} Feb 23 07:06:16 crc kubenswrapper[4626]: I0223 07:06:16.337751 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v7gf" event={"ID":"c3bbb358-b0aa-4501-ab41-3b9369d3043d","Type":"ContainerStarted","Data":"d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be"} Feb 23 07:06:17 crc kubenswrapper[4626]: I0223 07:06:17.353588 4626 generic.go:334] "Generic (PLEG): container finished" podID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerID="d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be" exitCode=0 Feb 23 07:06:17 crc kubenswrapper[4626]: I0223 07:06:17.353649 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v7gf" 
event={"ID":"c3bbb358-b0aa-4501-ab41-3b9369d3043d","Type":"ContainerDied","Data":"d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be"} Feb 23 07:06:18 crc kubenswrapper[4626]: I0223 07:06:18.368292 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v7gf" event={"ID":"c3bbb358-b0aa-4501-ab41-3b9369d3043d","Type":"ContainerStarted","Data":"fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244"} Feb 23 07:06:18 crc kubenswrapper[4626]: I0223 07:06:18.397341 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2v7gf" podStartSLOduration=1.908378715 podStartE2EDuration="4.397322687s" podCreationTimestamp="2026-02-23 07:06:14 +0000 UTC" firstStartedPulling="2026-02-23 07:06:15.32680874 +0000 UTC m=+1527.666138006" lastFinishedPulling="2026-02-23 07:06:17.815752712 +0000 UTC m=+1530.155081978" observedRunningTime="2026-02-23 07:06:18.384623643 +0000 UTC m=+1530.723952909" watchObservedRunningTime="2026-02-23 07:06:18.397322687 +0000 UTC m=+1530.736651953" Feb 23 07:06:24 crc kubenswrapper[4626]: I0223 07:06:24.477402 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:24 crc kubenswrapper[4626]: I0223 07:06:24.478121 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:24 crc kubenswrapper[4626]: I0223 07:06:24.527599 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:25 crc kubenswrapper[4626]: I0223 07:06:25.506186 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:25 crc kubenswrapper[4626]: I0223 07:06:25.561805 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-2v7gf"] Feb 23 07:06:25 crc kubenswrapper[4626]: I0223 07:06:25.685933 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:06:25 crc kubenswrapper[4626]: I0223 07:06:25.686319 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:06:27 crc kubenswrapper[4626]: I0223 07:06:27.478369 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2v7gf" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="registry-server" containerID="cri-o://fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244" gracePeriod=2 Feb 23 07:06:27 crc kubenswrapper[4626]: I0223 07:06:27.922693 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.122439 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-utilities\") pod \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.122545 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvn7n\" (UniqueName: \"kubernetes.io/projected/c3bbb358-b0aa-4501-ab41-3b9369d3043d-kube-api-access-kvn7n\") pod \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.122702 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-catalog-content\") pod \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\" (UID: \"c3bbb358-b0aa-4501-ab41-3b9369d3043d\") " Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.123461 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-utilities" (OuterVolumeSpecName: "utilities") pod "c3bbb358-b0aa-4501-ab41-3b9369d3043d" (UID: "c3bbb358-b0aa-4501-ab41-3b9369d3043d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.123755 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.129182 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bbb358-b0aa-4501-ab41-3b9369d3043d-kube-api-access-kvn7n" (OuterVolumeSpecName: "kube-api-access-kvn7n") pod "c3bbb358-b0aa-4501-ab41-3b9369d3043d" (UID: "c3bbb358-b0aa-4501-ab41-3b9369d3043d"). InnerVolumeSpecName "kube-api-access-kvn7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.169779 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3bbb358-b0aa-4501-ab41-3b9369d3043d" (UID: "c3bbb358-b0aa-4501-ab41-3b9369d3043d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.225571 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvn7n\" (UniqueName: \"kubernetes.io/projected/c3bbb358-b0aa-4501-ab41-3b9369d3043d-kube-api-access-kvn7n\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.225766 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bbb358-b0aa-4501-ab41-3b9369d3043d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.492310 4626 generic.go:334] "Generic (PLEG): container finished" podID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerID="fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244" exitCode=0 Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.492391 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v7gf" event={"ID":"c3bbb358-b0aa-4501-ab41-3b9369d3043d","Type":"ContainerDied","Data":"fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244"} Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.492418 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v7gf" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.492443 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v7gf" event={"ID":"c3bbb358-b0aa-4501-ab41-3b9369d3043d","Type":"ContainerDied","Data":"dcf120edb3e701baadbea2e8d0247acad5539749edea6d1df1afea288dd763a5"} Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.492466 4626 scope.go:117] "RemoveContainer" containerID="fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.531986 4626 scope.go:117] "RemoveContainer" containerID="d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.532259 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v7gf"] Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.546943 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2v7gf"] Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.563012 4626 scope.go:117] "RemoveContainer" containerID="8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.602839 4626 scope.go:117] "RemoveContainer" containerID="fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244" Feb 23 07:06:28 crc kubenswrapper[4626]: E0223 07:06:28.603269 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244\": container with ID starting with fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244 not found: ID does not exist" containerID="fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.603312 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244"} err="failed to get container status \"fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244\": rpc error: code = NotFound desc = could not find container \"fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244\": container with ID starting with fc01047bb4c5eefca3f8b29356bd797711994e0ab79ea5ab7b52448d81aa5244 not found: ID does not exist" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.603338 4626 scope.go:117] "RemoveContainer" containerID="d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be" Feb 23 07:06:28 crc kubenswrapper[4626]: E0223 07:06:28.603722 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be\": container with ID starting with d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be not found: ID does not exist" containerID="d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.603746 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be"} err="failed to get container status \"d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be\": rpc error: code = NotFound desc = could not find container \"d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be\": container with ID starting with d65dfbbbfcd03637d3d81ee01caff593694e2814f1b86b9a719d7f32369244be not found: ID does not exist" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.603764 4626 scope.go:117] "RemoveContainer" containerID="8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a" Feb 23 07:06:28 crc kubenswrapper[4626]: E0223 
07:06:28.604051 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a\": container with ID starting with 8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a not found: ID does not exist" containerID="8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a" Feb 23 07:06:28 crc kubenswrapper[4626]: I0223 07:06:28.604073 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a"} err="failed to get container status \"8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a\": rpc error: code = NotFound desc = could not find container \"8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a\": container with ID starting with 8651aaf762b6aad2c54e1f1c5bf3af2f2a7c29d1771a3a7f012390a3fc67d61a not found: ID does not exist" Feb 23 07:06:29 crc kubenswrapper[4626]: I0223 07:06:29.995904 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" path="/var/lib/kubelet/pods/c3bbb358-b0aa-4501-ab41-3b9369d3043d/volumes" Feb 23 07:06:55 crc kubenswrapper[4626]: I0223 07:06:55.684873 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:06:55 crc kubenswrapper[4626]: I0223 07:06:55.685386 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 07:07:25 crc kubenswrapper[4626]: I0223 07:07:25.685092 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:07:25 crc kubenswrapper[4626]: I0223 07:07:25.685859 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:07:25 crc kubenswrapper[4626]: I0223 07:07:25.685929 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:07:25 crc kubenswrapper[4626]: I0223 07:07:25.686775 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:07:25 crc kubenswrapper[4626]: I0223 07:07:25.686855 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" gracePeriod=600 Feb 23 07:07:25 crc kubenswrapper[4626]: E0223 07:07:25.808490 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:07:26 crc kubenswrapper[4626]: I0223 07:07:26.119113 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" exitCode=0 Feb 23 07:07:26 crc kubenswrapper[4626]: I0223 07:07:26.119160 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"} Feb 23 07:07:26 crc kubenswrapper[4626]: I0223 07:07:26.119231 4626 scope.go:117] "RemoveContainer" containerID="3ee7851d94adc9620d81b6807bd44d726c23bf9b55ab16cf5c2f4335cb177b50" Feb 23 07:07:26 crc kubenswrapper[4626]: I0223 07:07:26.119961 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:07:26 crc kubenswrapper[4626]: E0223 07:07:26.120304 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:07:38 crc kubenswrapper[4626]: I0223 07:07:38.982993 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:07:38 crc kubenswrapper[4626]: E0223 07:07:38.984183 4626 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:07:44 crc kubenswrapper[4626]: I0223 07:07:44.288427 4626 generic.go:334] "Generic (PLEG): container finished" podID="5d5da7da-f1d4-4a24-9a9b-e22d85625761" containerID="6fc699eef625bf7e6319594b95648677268098cd2904b6f0477cb28a460728e1" exitCode=0 Feb 23 07:07:44 crc kubenswrapper[4626]: I0223 07:07:44.288548 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" event={"ID":"5d5da7da-f1d4-4a24-9a9b-e22d85625761","Type":"ContainerDied","Data":"6fc699eef625bf7e6319594b95648677268098cd2904b6f0477cb28a460728e1"} Feb 23 07:07:45 crc kubenswrapper[4626]: I0223 07:07:45.941571 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:07:45 crc kubenswrapper[4626]: I0223 07:07:45.983130 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9lhw\" (UniqueName: \"kubernetes.io/projected/5d5da7da-f1d4-4a24-9a9b-e22d85625761-kube-api-access-p9lhw\") pod \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " Feb 23 07:07:45 crc kubenswrapper[4626]: I0223 07:07:45.983198 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-ssh-key-openstack-edpm-ipam\") pod \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " Feb 23 07:07:45 crc kubenswrapper[4626]: I0223 07:07:45.983272 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-bootstrap-combined-ca-bundle\") pod \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " Feb 23 07:07:45 crc kubenswrapper[4626]: I0223 07:07:45.991365 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5d5da7da-f1d4-4a24-9a9b-e22d85625761" (UID: "5d5da7da-f1d4-4a24-9a9b-e22d85625761"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.000735 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5da7da-f1d4-4a24-9a9b-e22d85625761-kube-api-access-p9lhw" (OuterVolumeSpecName: "kube-api-access-p9lhw") pod "5d5da7da-f1d4-4a24-9a9b-e22d85625761" (UID: "5d5da7da-f1d4-4a24-9a9b-e22d85625761"). InnerVolumeSpecName "kube-api-access-p9lhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.016235 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5d5da7da-f1d4-4a24-9a9b-e22d85625761" (UID: "5d5da7da-f1d4-4a24-9a9b-e22d85625761"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.086137 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-inventory\") pod \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\" (UID: \"5d5da7da-f1d4-4a24-9a9b-e22d85625761\") " Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.087249 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9lhw\" (UniqueName: \"kubernetes.io/projected/5d5da7da-f1d4-4a24-9a9b-e22d85625761-kube-api-access-p9lhw\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.087278 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.087292 4626 reconciler_common.go:293] 
"Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.108531 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-inventory" (OuterVolumeSpecName: "inventory") pod "5d5da7da-f1d4-4a24-9a9b-e22d85625761" (UID: "5d5da7da-f1d4-4a24-9a9b-e22d85625761"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.189142 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d5da7da-f1d4-4a24-9a9b-e22d85625761-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.314198 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" event={"ID":"5d5da7da-f1d4-4a24-9a9b-e22d85625761","Type":"ContainerDied","Data":"bd5dd6768d8a0d417b762f8208aea5014850c00da0de48a98fc1aea4eb0105ed"} Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.314245 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.314251 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5dd6768d8a0d417b762f8208aea5014850c00da0de48a98fc1aea4eb0105ed" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.415551 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k"] Feb 23 07:07:46 crc kubenswrapper[4626]: E0223 07:07:46.416093 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="registry-server" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.416124 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="registry-server" Feb 23 07:07:46 crc kubenswrapper[4626]: E0223 07:07:46.416152 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="extract-content" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.416164 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="extract-content" Feb 23 07:07:46 crc kubenswrapper[4626]: E0223 07:07:46.416182 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="extract-utilities" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.416189 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="extract-utilities" Feb 23 07:07:46 crc kubenswrapper[4626]: E0223 07:07:46.416196 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5da7da-f1d4-4a24-9a9b-e22d85625761" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.416203 4626 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5da7da-f1d4-4a24-9a9b-e22d85625761" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.416439 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bbb358-b0aa-4501-ab41-3b9369d3043d" containerName="registry-server" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.416458 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5da7da-f1d4-4a24-9a9b-e22d85625761" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.417301 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.422793 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.423121 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.423408 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.423598 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.441197 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k"] Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.498487 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.498637 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.498918 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpb9\" (UniqueName: \"kubernetes.io/projected/b968ab81-8b5f-49c7-830b-220b90d6b1f1-kube-api-access-bzpb9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.601430 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.601562 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.601682 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpb9\" (UniqueName: \"kubernetes.io/projected/b968ab81-8b5f-49c7-830b-220b90d6b1f1-kube-api-access-bzpb9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.606156 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.606277 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.616659 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpb9\" (UniqueName: \"kubernetes.io/projected/b968ab81-8b5f-49c7-830b-220b90d6b1f1-kube-api-access-bzpb9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:46 crc kubenswrapper[4626]: I0223 07:07:46.742204 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" Feb 23 07:07:47 crc kubenswrapper[4626]: I0223 07:07:47.344272 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k"] Feb 23 07:07:48 crc kubenswrapper[4626]: I0223 07:07:48.332916 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" event={"ID":"b968ab81-8b5f-49c7-830b-220b90d6b1f1","Type":"ContainerStarted","Data":"7f628658310fe71c4082b009c32e46497a2de0382f76506b63ae6f544f4ce13b"} Feb 23 07:07:48 crc kubenswrapper[4626]: I0223 07:07:48.333419 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" event={"ID":"b968ab81-8b5f-49c7-830b-220b90d6b1f1","Type":"ContainerStarted","Data":"b9b168c5a48159ce734438c2f7d74895d28579b1525e65652bbb6ef586d75017"} Feb 23 07:07:48 crc kubenswrapper[4626]: I0223 07:07:48.352611 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" podStartSLOduration=1.825370771 podStartE2EDuration="2.352593053s" podCreationTimestamp="2026-02-23 07:07:46 +0000 UTC" firstStartedPulling="2026-02-23 07:07:47.362534855 +0000 UTC m=+1619.701864121" lastFinishedPulling="2026-02-23 07:07:47.889757137 +0000 UTC m=+1620.229086403" observedRunningTime="2026-02-23 07:07:48.344214064 +0000 UTC m=+1620.683543329" watchObservedRunningTime="2026-02-23 07:07:48.352593053 +0000 UTC m=+1620.691922319" Feb 23 07:07:52 crc kubenswrapper[4626]: I0223 07:07:52.984239 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:07:52 crc kubenswrapper[4626]: E0223 07:07:52.985393 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:08:07 crc kubenswrapper[4626]: I0223 07:08:07.989633 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:08:07 crc kubenswrapper[4626]: E0223 07:08:07.992234 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.049758 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-txmcg"] Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.056761 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5002-account-create-update-2j59l"] Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.065447 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-291c-account-create-update-6cvgc"] Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.070894 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5002-account-create-update-2j59l"] Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.075766 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rt5rb"] Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.080587 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-291c-account-create-update-6cvgc"] 
Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.086132 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rt5rb"] Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.090869 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-txmcg"] Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.994335 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59fa2739-f9e3-4007-9e1c-8f95cb92713e" path="/var/lib/kubelet/pods/59fa2739-f9e3-4007-9e1c-8f95cb92713e/volumes" Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.996259 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6" path="/var/lib/kubelet/pods/b0dcb83c-5b75-4b79-ba8f-9ac4464efaf6/volumes" Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.997802 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf8821f-f9db-4112-80cb-a85ecbd60c66" path="/var/lib/kubelet/pods/bbf8821f-f9db-4112-80cb-a85ecbd60c66/volumes" Feb 23 07:08:17 crc kubenswrapper[4626]: I0223 07:08:17.999776 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a" path="/var/lib/kubelet/pods/e49b8a5b-7a3e-4223-8cbb-4ffd9d40a75a/volumes" Feb 23 07:08:18 crc kubenswrapper[4626]: I0223 07:08:18.038854 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4bdb-account-create-update-2s5v2"] Feb 23 07:08:18 crc kubenswrapper[4626]: I0223 07:08:18.050971 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vvf5n"] Feb 23 07:08:18 crc kubenswrapper[4626]: I0223 07:08:18.061449 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4bdb-account-create-update-2s5v2"] Feb 23 07:08:18 crc kubenswrapper[4626]: I0223 07:08:18.067157 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vvf5n"] Feb 
23 07:08:19 crc kubenswrapper[4626]: I0223 07:08:19.293562 4626 scope.go:117] "RemoveContainer" containerID="10b027b83b437b6c5981eb0d5984bab81043cd8bf1425df719420e75c22fa87e" Feb 23 07:08:19 crc kubenswrapper[4626]: I0223 07:08:19.331030 4626 scope.go:117] "RemoveContainer" containerID="1e0962286903e693d402f0eeda854adbaa71b06ddae8f2c468fe8523070e8f39" Feb 23 07:08:19 crc kubenswrapper[4626]: I0223 07:08:19.369926 4626 scope.go:117] "RemoveContainer" containerID="414cd5241cbb972da64432d6754407471801b969a757625d7f56b7fa2ebd5650" Feb 23 07:08:19 crc kubenswrapper[4626]: I0223 07:08:19.411910 4626 scope.go:117] "RemoveContainer" containerID="ff882c8ab5749639b1baa3892d8e22f4647306e1e844fc7420c59c344c3bf9e1" Feb 23 07:08:19 crc kubenswrapper[4626]: I0223 07:08:19.993916 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986e7467-8ed5-4f55-8518-e6c539b02c17" path="/var/lib/kubelet/pods/986e7467-8ed5-4f55-8518-e6c539b02c17/volumes" Feb 23 07:08:19 crc kubenswrapper[4626]: I0223 07:08:19.995721 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9" path="/var/lib/kubelet/pods/d351e8f0-0b2c-4e4a-a6ee-3540a45a2ac9/volumes" Feb 23 07:08:21 crc kubenswrapper[4626]: I0223 07:08:21.983041 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:08:21 crc kubenswrapper[4626]: E0223 07:08:21.983756 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:08:28 crc kubenswrapper[4626]: I0223 07:08:28.040132 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/root-account-create-update-6k9g5"]
Feb 23 07:08:28 crc kubenswrapper[4626]: I0223 07:08:28.056654 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6k9g5"]
Feb 23 07:08:29 crc kubenswrapper[4626]: I0223 07:08:29.993850 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ff9fef-0642-4436-ac2d-d9a8b26f950e" path="/var/lib/kubelet/pods/30ff9fef-0642-4436-ac2d-d9a8b26f950e/volumes"
Feb 23 07:08:32 crc kubenswrapper[4626]: I0223 07:08:32.982798 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:08:32 crc kubenswrapper[4626]: E0223 07:08:32.984566 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:08:43 crc kubenswrapper[4626]: I0223 07:08:43.045182 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9lq4g"]
Feb 23 07:08:43 crc kubenswrapper[4626]: I0223 07:08:43.054736 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9lq4g"]
Feb 23 07:08:43 crc kubenswrapper[4626]: I0223 07:08:43.982372 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:08:43 crc kubenswrapper[4626]: E0223 07:08:43.982977 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:08:43 crc kubenswrapper[4626]: I0223 07:08:43.994190 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421f011a-6b43-4b9d-9fa7-c293dc581234" path="/var/lib/kubelet/pods/421f011a-6b43-4b9d-9fa7-c293dc581234/volumes"
Feb 23 07:08:54 crc kubenswrapper[4626]: I0223 07:08:54.046457 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-be2b-account-create-update-958v4"]
Feb 23 07:08:54 crc kubenswrapper[4626]: I0223 07:08:54.058698 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-be2b-account-create-update-958v4"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.051541 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9fa0-account-create-update-nshxv"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.080725 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4vwh7"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.095373 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-hpjks"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.110159 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-25mlg"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.122990 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4vwh7"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.133918 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9fa0-account-create-update-nshxv"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.146425 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-hpjks"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.155006 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-25mlg"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.162240 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-pvhs6"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.172267 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-d699-account-create-update-4hp8b"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.183746 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-pvhs6"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.193037 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-d699-account-create-update-4hp8b"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.200611 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-97a9-account-create-update-k7jvv"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.208155 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-97a9-account-create-update-k7jvv"]
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.993367 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a600db-8797-4c3e-98ee-d98d7daf59f9" path="/var/lib/kubelet/pods/13a600db-8797-4c3e-98ee-d98d7daf59f9/volumes"
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.995001 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b08b605-e241-47db-98f5-3b6051c589ee" path="/var/lib/kubelet/pods/3b08b605-e241-47db-98f5-3b6051c589ee/volumes"
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.996405 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48794fdc-dcd4-4e56-88ea-6628ef7b4b80" path="/var/lib/kubelet/pods/48794fdc-dcd4-4e56-88ea-6628ef7b4b80/volumes"
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.997447 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4002ff-ea4d-4c5d-a4da-793513d51e83" path="/var/lib/kubelet/pods/4c4002ff-ea4d-4c5d-a4da-793513d51e83/volumes"
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.999009 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e24e17-251e-4136-96cd-30d6a0fc3ee4" path="/var/lib/kubelet/pods/51e24e17-251e-4136-96cd-30d6a0fc3ee4/volumes"
Feb 23 07:08:55 crc kubenswrapper[4626]: I0223 07:08:55.999860 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c64a62-5f00-430d-af04-03ec55d5029d" path="/var/lib/kubelet/pods/62c64a62-5f00-430d-af04-03ec55d5029d/volumes"
Feb 23 07:08:56 crc kubenswrapper[4626]: I0223 07:08:56.000677 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44bbc0b-c334-4a3e-8a51-10394d82a253" path="/var/lib/kubelet/pods/c44bbc0b-c334-4a3e-8a51-10394d82a253/volumes"
Feb 23 07:08:56 crc kubenswrapper[4626]: I0223 07:08:56.002191 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb7035a-577f-4291-b0dd-e4ee6e011018" path="/var/lib/kubelet/pods/eeb7035a-577f-4291-b0dd-e4ee6e011018/volumes"
Feb 23 07:08:56 crc kubenswrapper[4626]: I0223 07:08:56.982172 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:08:56 crc kubenswrapper[4626]: E0223 07:08:56.982568 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:09:02 crc kubenswrapper[4626]: I0223 07:09:02.032363 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5hrgs"]
Feb 23 07:09:02 crc kubenswrapper[4626]: I0223 07:09:02.040433 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5hrgs"]
Feb 23 07:09:03 crc kubenswrapper[4626]: I0223 07:09:03.992066 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f759b0fe-50ac-4eb6-8539-c34dcf9cf501" path="/var/lib/kubelet/pods/f759b0fe-50ac-4eb6-8539-c34dcf9cf501/volumes"
Feb 23 07:09:08 crc kubenswrapper[4626]: I0223 07:09:08.981968 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:09:08 crc kubenswrapper[4626]: E0223 07:09:08.983750 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.546988 4626 scope.go:117] "RemoveContainer" containerID="aeb9eefa3727ba91125e0fcde1f5155a3590ae339ab8bbca8134de8447fbef33"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.578040 4626 scope.go:117] "RemoveContainer" containerID="8be512aca6c69f2b206dc98db98e60f65e91a97d1accaff62890686b6521a231"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.618480 4626 scope.go:117] "RemoveContainer" containerID="8eaf17ab19ba278ff22cac801526ab596f6d7cf5023260078ab0a2da276d7106"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.649669 4626 scope.go:117] "RemoveContainer" containerID="bc0424fd284b479b719c4557b4b1273418ef13b422ddf1af6cc32fe49ff9e0c5"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.680962 4626 scope.go:117] "RemoveContainer" containerID="cefb121300a0e7a9bd8e74f118c263b7612f556b4ea99e6b50d61895b38954e6"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.715287 4626 scope.go:117] "RemoveContainer" containerID="83a1ca0d21c9a868af64d8bf98fc7001036bc86e1482b053dbd1ac3070a989be"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.743617 4626 scope.go:117] "RemoveContainer" containerID="5a10d71f4f20accc93d10a575735e6934c5a3db8378b2601d8eb0bfcb5298786"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.761997 4626 scope.go:117] "RemoveContainer" containerID="799ea3c4c27f9b76b968a94c0ac8a653eac2bc08dea4c8be759b7fe141e64e60"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.781166 4626 scope.go:117] "RemoveContainer" containerID="eec1e8e901634153f11a7a885ba0be5973fd9517d02f370a7f6ada4f19c7a8ca"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.799088 4626 scope.go:117] "RemoveContainer" containerID="296b5ead7448f29c39b2e2af1237718207ca790f3206d8ca64b163f689cee403"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.818485 4626 scope.go:117] "RemoveContainer" containerID="297990957fc057e6488e74b739fedf58792340d719618b5015b3c40259b71805"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.837951 4626 scope.go:117] "RemoveContainer" containerID="98c075d0d528adb9f1f6bc5433206ddd380210f877152227af5922736a573e3b"
Feb 23 07:09:19 crc kubenswrapper[4626]: I0223 07:09:19.853895 4626 scope.go:117] "RemoveContainer" containerID="90f39ed5761bd663b510d80526abdec8fdc67d692b43aa71a0ac30d46daa28ee"
Feb 23 07:09:23 crc kubenswrapper[4626]: I0223 07:09:23.983084 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:09:23 crc kubenswrapper[4626]: E0223 07:09:23.984646 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:09:29 crc kubenswrapper[4626]: I0223 07:09:29.045897 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jp5n9"]
Feb 23 07:09:29 crc kubenswrapper[4626]: I0223 07:09:29.053384 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jp5n9"]
Feb 23 07:09:29 crc kubenswrapper[4626]: I0223 07:09:29.995836 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f3630a-a5f4-4a54-91e3-e6764673beca" path="/var/lib/kubelet/pods/81f3630a-a5f4-4a54-91e3-e6764673beca/volumes"
Feb 23 07:09:38 crc kubenswrapper[4626]: I0223 07:09:38.982543 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:09:38 crc kubenswrapper[4626]: E0223 07:09:38.983038 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:09:47 crc kubenswrapper[4626]: I0223 07:09:47.042562 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6x8cq"]
Feb 23 07:09:47 crc kubenswrapper[4626]: I0223 07:09:47.052342 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6x8cq"]
Feb 23 07:09:47 crc kubenswrapper[4626]: I0223 07:09:47.994680 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99407a1e-403e-4460-8bff-8eb644010b4c" path="/var/lib/kubelet/pods/99407a1e-403e-4460-8bff-8eb644010b4c/volumes"
Feb 23 07:09:48 crc kubenswrapper[4626]: I0223 07:09:48.040619 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-nxxgl"]
Feb 23 07:09:48 crc kubenswrapper[4626]: I0223 07:09:48.050023 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7stdk"]
Feb 23 07:09:48 crc kubenswrapper[4626]: I0223 07:09:48.057676 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7stdk"]
Feb 23 07:09:48 crc kubenswrapper[4626]: I0223 07:09:48.064303 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-nxxgl"]
Feb 23 07:09:49 crc kubenswrapper[4626]: I0223 07:09:49.554831 4626 generic.go:334] "Generic (PLEG): container finished" podID="b968ab81-8b5f-49c7-830b-220b90d6b1f1" containerID="7f628658310fe71c4082b009c32e46497a2de0382f76506b63ae6f544f4ce13b" exitCode=0
Feb 23 07:09:49 crc kubenswrapper[4626]: I0223 07:09:49.554925 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" event={"ID":"b968ab81-8b5f-49c7-830b-220b90d6b1f1","Type":"ContainerDied","Data":"7f628658310fe71c4082b009c32e46497a2de0382f76506b63ae6f544f4ce13b"}
Feb 23 07:09:49 crc kubenswrapper[4626]: I0223 07:09:49.981780 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:09:49 crc kubenswrapper[4626]: E0223 07:09:49.982066 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:09:49 crc kubenswrapper[4626]: I0223 07:09:49.992558 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498f5c1c-5f75-49e1-909a-e7ce904ebd9d" path="/var/lib/kubelet/pods/498f5c1c-5f75-49e1-909a-e7ce904ebd9d/volumes"
Feb 23 07:09:49 crc kubenswrapper[4626]: I0223 07:09:49.994616 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a203dc9f-43a6-4cf4-ac68-7c5125053cba" path="/var/lib/kubelet/pods/a203dc9f-43a6-4cf4-ac68-7c5125053cba/volumes"
Feb 23 07:09:50 crc kubenswrapper[4626]: I0223 07:09:50.903130 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k"
Feb 23 07:09:50 crc kubenswrapper[4626]: I0223 07:09:50.939776 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-ssh-key-openstack-edpm-ipam\") pod \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") "
Feb 23 07:09:50 crc kubenswrapper[4626]: I0223 07:09:50.939959 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzpb9\" (UniqueName: \"kubernetes.io/projected/b968ab81-8b5f-49c7-830b-220b90d6b1f1-kube-api-access-bzpb9\") pod \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") "
Feb 23 07:09:50 crc kubenswrapper[4626]: I0223 07:09:50.940018 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-inventory\") pod \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\" (UID: \"b968ab81-8b5f-49c7-830b-220b90d6b1f1\") "
Feb 23 07:09:50 crc kubenswrapper[4626]: I0223 07:09:50.944920 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b968ab81-8b5f-49c7-830b-220b90d6b1f1-kube-api-access-bzpb9" (OuterVolumeSpecName: "kube-api-access-bzpb9") pod "b968ab81-8b5f-49c7-830b-220b90d6b1f1" (UID: "b968ab81-8b5f-49c7-830b-220b90d6b1f1"). InnerVolumeSpecName "kube-api-access-bzpb9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:09:50 crc kubenswrapper[4626]: I0223 07:09:50.963897 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-inventory" (OuterVolumeSpecName: "inventory") pod "b968ab81-8b5f-49c7-830b-220b90d6b1f1" (UID: "b968ab81-8b5f-49c7-830b-220b90d6b1f1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:50 crc kubenswrapper[4626]: I0223 07:09:50.972294 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b968ab81-8b5f-49c7-830b-220b90d6b1f1" (UID: "b968ab81-8b5f-49c7-830b-220b90d6b1f1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.042555 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzpb9\" (UniqueName: \"kubernetes.io/projected/b968ab81-8b5f-49c7-830b-220b90d6b1f1-kube-api-access-bzpb9\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.042589 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.042601 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b968ab81-8b5f-49c7-830b-220b90d6b1f1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.581588 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k" event={"ID":"b968ab81-8b5f-49c7-830b-220b90d6b1f1","Type":"ContainerDied","Data":"b9b168c5a48159ce734438c2f7d74895d28579b1525e65652bbb6ef586d75017"}
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.581818 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9b168c5a48159ce734438c2f7d74895d28579b1525e65652bbb6ef586d75017"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.581638 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.655367 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"]
Feb 23 07:09:51 crc kubenswrapper[4626]: E0223 07:09:51.655756 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b968ab81-8b5f-49c7-830b-220b90d6b1f1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.655777 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b968ab81-8b5f-49c7-830b-220b90d6b1f1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.655995 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b968ab81-8b5f-49c7-830b-220b90d6b1f1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.656655 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.664455 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.664762 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.664813 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.665065 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.667816 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"]
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.856697 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.856801 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.856956 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbs5\" (UniqueName: \"kubernetes.io/projected/17b6f47d-57a1-46e9-be66-1f93b98664c3-kube-api-access-cvbs5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.958879 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.958952 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.959026 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbs5\" (UniqueName: \"kubernetes.io/projected/17b6f47d-57a1-46e9-be66-1f93b98664c3-kube-api-access-cvbs5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.967184 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.970267 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:51 crc kubenswrapper[4626]: I0223 07:09:51.979064 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbs5\" (UniqueName: \"kubernetes.io/projected/17b6f47d-57a1-46e9-be66-1f93b98664c3-kube-api-access-cvbs5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:52 crc kubenswrapper[4626]: I0223 07:09:52.277055 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:09:52 crc kubenswrapper[4626]: I0223 07:09:52.751931 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"]
Feb 23 07:09:52 crc kubenswrapper[4626]: I0223 07:09:52.759282 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 07:09:53 crc kubenswrapper[4626]: I0223 07:09:53.610486 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn" event={"ID":"17b6f47d-57a1-46e9-be66-1f93b98664c3","Type":"ContainerStarted","Data":"b83270d60129179c88072195f8b9abb08684055f2fe3ca3f9c12079e954e4608"}
Feb 23 07:09:53 crc kubenswrapper[4626]: I0223 07:09:53.610990 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn" event={"ID":"17b6f47d-57a1-46e9-be66-1f93b98664c3","Type":"ContainerStarted","Data":"e323ba0fcaafd6597d8e0e6e0c58176079824f9c18afec19f41646f37b5c031c"}
Feb 23 07:09:53 crc kubenswrapper[4626]: I0223 07:09:53.638443 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn" podStartSLOduration=2.088602694 podStartE2EDuration="2.638426562s" podCreationTimestamp="2026-02-23 07:09:51 +0000 UTC" firstStartedPulling="2026-02-23 07:09:52.759054242 +0000 UTC m=+1745.098383508" lastFinishedPulling="2026-02-23 07:09:53.30887811 +0000 UTC m=+1745.648207376" observedRunningTime="2026-02-23 07:09:53.62977086 +0000 UTC m=+1745.969100136" watchObservedRunningTime="2026-02-23 07:09:53.638426562 +0000 UTC m=+1745.977755828"
Feb 23 07:10:03 crc kubenswrapper[4626]: I0223 07:10:03.982319 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:10:03 crc kubenswrapper[4626]: E0223 07:10:03.983390 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:10:09 crc kubenswrapper[4626]: I0223 07:10:09.050250 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-47npm"]
Feb 23 07:10:09 crc kubenswrapper[4626]: I0223 07:10:09.058305 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-47npm"]
Feb 23 07:10:09 crc kubenswrapper[4626]: I0223 07:10:09.994811 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc94ca-558e-4a2c-8d28-5aedbecb3090" path="/var/lib/kubelet/pods/c5cc94ca-558e-4a2c-8d28-5aedbecb3090/volumes"
Feb 23 07:10:11 crc kubenswrapper[4626]: I0223 07:10:11.043673 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-fttdm"]
Feb 23 07:10:11 crc kubenswrapper[4626]: I0223 07:10:11.051060 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-fttdm"]
Feb 23 07:10:11 crc kubenswrapper[4626]: I0223 07:10:11.993481 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1806f1a-08dd-4b17-a799-1122348a4ab3" path="/var/lib/kubelet/pods/c1806f1a-08dd-4b17-a799-1122348a4ab3/volumes"
Feb 23 07:10:17 crc kubenswrapper[4626]: I0223 07:10:17.988595 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:10:17 crc kubenswrapper[4626]: E0223 07:10:17.991470 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:10:20 crc kubenswrapper[4626]: I0223 07:10:20.080922 4626 scope.go:117] "RemoveContainer" containerID="165b5f3d9af6d08ff22d0934fda72534b4c88204b2d76378f1734acceafb7f84"
Feb 23 07:10:20 crc kubenswrapper[4626]: I0223 07:10:20.104688 4626 scope.go:117] "RemoveContainer" containerID="a4c287ec73a0eec2d6f0a4c85df408793997ca1b2130c7115cb3bc92e74a7ac7"
Feb 23 07:10:20 crc kubenswrapper[4626]: I0223 07:10:20.135559 4626 scope.go:117] "RemoveContainer" containerID="62d1aeac828dbe31c2b117686504cfca86362d4c695dde2b94f097502f9b946f"
Feb 23 07:10:20 crc kubenswrapper[4626]: I0223 07:10:20.185929 4626 scope.go:117] "RemoveContainer" containerID="71904e160c42cf04472b0c2a1700ca0d85acc9141bf20632c450a5294151df1b"
Feb 23 07:10:20 crc kubenswrapper[4626]: I0223 07:10:20.226531 4626 scope.go:117] "RemoveContainer" containerID="a57d9b1f428f07b68f9321332ba192416d73ec5cf8cd22ee06bda8881d2a494c"
Feb 23 07:10:20 crc kubenswrapper[4626]: I0223 07:10:20.258191 4626 scope.go:117] "RemoveContainer" containerID="887e4fb3093570f30aaf87674069d3a964dda7b9537cf7d6ece01231c19c72aa"
Feb 23 07:10:30 crc kubenswrapper[4626]: I0223 07:10:30.981789 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:10:30 crc kubenswrapper[4626]: E0223 07:10:30.982743 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:10:45 crc kubenswrapper[4626]: I0223 07:10:45.983538 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:10:45 crc kubenswrapper[4626]: E0223 07:10:45.984636 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:10:57 crc kubenswrapper[4626]: I0223 07:10:57.047593 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-c8mts"]
Feb 23 07:10:57 crc kubenswrapper[4626]: I0223 07:10:57.059296 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-c8mts"]
Feb 23 07:10:57 crc kubenswrapper[4626]: I0223 07:10:57.996929 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5" path="/var/lib/kubelet/pods/3665cb26-3a39-4ba7-8865-eb7ecc6a0fb5/volumes"
Feb 23 07:10:58 crc kubenswrapper[4626]: I0223 07:10:58.034749 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-g4nht"]
Feb 23 07:10:58 crc kubenswrapper[4626]: I0223 07:10:58.045565 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tpl9d"]
Feb 23 07:10:58 crc kubenswrapper[4626]: I0223 07:10:58.053568 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tpl9d"]
Feb 23 07:10:58 crc kubenswrapper[4626]: I0223 07:10:58.059167 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-g4nht"]
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.043903 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8656-account-create-update-4cfz9"]
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.054485 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9b56-account-create-update-gxhrg"]
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.060723 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9b56-account-create-update-gxhrg"]
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.066201 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8656-account-create-update-4cfz9"]
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.071122 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e978-account-create-update-5t98k"]
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.075898 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e978-account-create-update-5t98k"]
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.994481 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc4adbd-6bd0-4092-9a3f-59d17a09cb86" path="/var/lib/kubelet/pods/1dc4adbd-6bd0-4092-9a3f-59d17a09cb86/volumes"
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.995624 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e37d982-ad74-4cac-bd65-fc40c41f4dc5" path="/var/lib/kubelet/pods/1e37d982-ad74-4cac-bd65-fc40c41f4dc5/volumes"
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.996548 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31c479ac-ec69-4792-848a-20a6e6e92ee1" path="/var/lib/kubelet/pods/31c479ac-ec69-4792-848a-20a6e6e92ee1/volumes"
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.997141 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45aedbbc-efbd-4bf7-bbdb-f36992267beb" path="/var/lib/kubelet/pods/45aedbbc-efbd-4bf7-bbdb-f36992267beb/volumes"
Feb 23 07:10:59 crc kubenswrapper[4626]: I0223 07:10:59.998683 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dc99b5-2935-4a73-aad4-ac3f687e6c9c" path="/var/lib/kubelet/pods/d5dc99b5-2935-4a73-aad4-ac3f687e6c9c/volumes"
Feb 23 07:11:00 crc kubenswrapper[4626]: I0223 07:11:00.292813 4626 generic.go:334] "Generic (PLEG): container finished" podID="17b6f47d-57a1-46e9-be66-1f93b98664c3" containerID="b83270d60129179c88072195f8b9abb08684055f2fe3ca3f9c12079e954e4608" exitCode=0
Feb 23 07:11:00 crc kubenswrapper[4626]: I0223 07:11:00.292876 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn" event={"ID":"17b6f47d-57a1-46e9-be66-1f93b98664c3","Type":"ContainerDied","Data":"b83270d60129179c88072195f8b9abb08684055f2fe3ca3f9c12079e954e4608"}
Feb 23 07:11:00 crc kubenswrapper[4626]: I0223 07:11:00.982549 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:11:00 crc kubenswrapper[4626]: E0223 07:11:00.982895 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.664712 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn"
Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.850255 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-inventory\") pod \"17b6f47d-57a1-46e9-be66-1f93b98664c3\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") "
Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.850445 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-ssh-key-openstack-edpm-ipam\") pod \"17b6f47d-57a1-46e9-be66-1f93b98664c3\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") "
Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.850481 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbs5\" (UniqueName: \"kubernetes.io/projected/17b6f47d-57a1-46e9-be66-1f93b98664c3-kube-api-access-cvbs5\") pod \"17b6f47d-57a1-46e9-be66-1f93b98664c3\" (UID: \"17b6f47d-57a1-46e9-be66-1f93b98664c3\") "
Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.856510 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b6f47d-57a1-46e9-be66-1f93b98664c3-kube-api-access-cvbs5" (OuterVolumeSpecName: "kube-api-access-cvbs5") pod "17b6f47d-57a1-46e9-be66-1f93b98664c3" (UID: "17b6f47d-57a1-46e9-be66-1f93b98664c3"). InnerVolumeSpecName "kube-api-access-cvbs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.878434 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-inventory" (OuterVolumeSpecName: "inventory") pod "17b6f47d-57a1-46e9-be66-1f93b98664c3" (UID: "17b6f47d-57a1-46e9-be66-1f93b98664c3").
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.878876 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "17b6f47d-57a1-46e9-be66-1f93b98664c3" (UID: "17b6f47d-57a1-46e9-be66-1f93b98664c3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.953271 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.953318 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17b6f47d-57a1-46e9-be66-1f93b98664c3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:01 crc kubenswrapper[4626]: I0223 07:11:01.953334 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbs5\" (UniqueName: \"kubernetes.io/projected/17b6f47d-57a1-46e9-be66-1f93b98664c3-kube-api-access-cvbs5\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.323195 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn" event={"ID":"17b6f47d-57a1-46e9-be66-1f93b98664c3","Type":"ContainerDied","Data":"e323ba0fcaafd6597d8e0e6e0c58176079824f9c18afec19f41646f37b5c031c"} Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.323274 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e323ba0fcaafd6597d8e0e6e0c58176079824f9c18afec19f41646f37b5c031c" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 
07:11:02.323440 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.397987 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4"] Feb 23 07:11:02 crc kubenswrapper[4626]: E0223 07:11:02.398403 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b6f47d-57a1-46e9-be66-1f93b98664c3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.398422 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6f47d-57a1-46e9-be66-1f93b98664c3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.398643 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b6f47d-57a1-46e9-be66-1f93b98664c3" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.399307 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.401315 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.401480 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.402589 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.402799 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.418988 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4"] Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.467044 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.467424 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfpvf\" (UniqueName: \"kubernetes.io/projected/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-kube-api-access-xfpvf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 
07:11:02.467470 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.569891 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfpvf\" (UniqueName: \"kubernetes.io/projected/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-kube-api-access-xfpvf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.569940 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.570066 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.575675 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.578685 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.590810 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfpvf\" (UniqueName: \"kubernetes.io/projected/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-kube-api-access-xfpvf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:02 crc kubenswrapper[4626]: I0223 07:11:02.720667 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:03 crc kubenswrapper[4626]: I0223 07:11:03.207304 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4"] Feb 23 07:11:03 crc kubenswrapper[4626]: I0223 07:11:03.335524 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" event={"ID":"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000","Type":"ContainerStarted","Data":"1c55a1376aea74662a6bdf9d28a3e266b421b7505e7447ca80aca18951af6613"} Feb 23 07:11:04 crc kubenswrapper[4626]: I0223 07:11:04.357334 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" event={"ID":"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000","Type":"ContainerStarted","Data":"730474999faed82f70c836a0c01316a3761cbf9754f62db1f77110d27209c760"} Feb 23 07:11:08 crc kubenswrapper[4626]: I0223 07:11:08.398892 4626 generic.go:334] "Generic (PLEG): container finished" podID="8d4a8ee4-c271-4fe2-b9ea-85a1a176a000" containerID="730474999faed82f70c836a0c01316a3761cbf9754f62db1f77110d27209c760" exitCode=0 Feb 23 07:11:08 crc kubenswrapper[4626]: I0223 07:11:08.398986 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" event={"ID":"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000","Type":"ContainerDied","Data":"730474999faed82f70c836a0c01316a3761cbf9754f62db1f77110d27209c760"} Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.776010 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.847634 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-ssh-key-openstack-edpm-ipam\") pod \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.847797 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-inventory\") pod \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.848102 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfpvf\" (UniqueName: \"kubernetes.io/projected/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-kube-api-access-xfpvf\") pod \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\" (UID: \"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000\") " Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.857543 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-kube-api-access-xfpvf" (OuterVolumeSpecName: "kube-api-access-xfpvf") pod "8d4a8ee4-c271-4fe2-b9ea-85a1a176a000" (UID: "8d4a8ee4-c271-4fe2-b9ea-85a1a176a000"). InnerVolumeSpecName "kube-api-access-xfpvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.875847 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8d4a8ee4-c271-4fe2-b9ea-85a1a176a000" (UID: "8d4a8ee4-c271-4fe2-b9ea-85a1a176a000"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.876955 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-inventory" (OuterVolumeSpecName: "inventory") pod "8d4a8ee4-c271-4fe2-b9ea-85a1a176a000" (UID: "8d4a8ee4-c271-4fe2-b9ea-85a1a176a000"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.950892 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.950924 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfpvf\" (UniqueName: \"kubernetes.io/projected/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-kube-api-access-xfpvf\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:09 crc kubenswrapper[4626]: I0223 07:11:09.950937 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4a8ee4-c271-4fe2-b9ea-85a1a176a000-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.425863 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" 
event={"ID":"8d4a8ee4-c271-4fe2-b9ea-85a1a176a000","Type":"ContainerDied","Data":"1c55a1376aea74662a6bdf9d28a3e266b421b7505e7447ca80aca18951af6613"} Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.425927 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c55a1376aea74662a6bdf9d28a3e266b421b7505e7447ca80aca18951af6613" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.426023 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.493016 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8"] Feb 23 07:11:10 crc kubenswrapper[4626]: E0223 07:11:10.493773 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4a8ee4-c271-4fe2-b9ea-85a1a176a000" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.493972 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4a8ee4-c271-4fe2-b9ea-85a1a176a000" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.499158 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4a8ee4-c271-4fe2-b9ea-85a1a176a000" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.500054 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.504598 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.506217 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8"] Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.506254 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.506341 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.510587 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.568409 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2jp\" (UniqueName: \"kubernetes.io/projected/f5ac8c56-2109-41c7-8129-5561016dbaef-kube-api-access-ws2jp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.568801 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 
07:11:10.568916 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.671133 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2jp\" (UniqueName: \"kubernetes.io/projected/f5ac8c56-2109-41c7-8129-5561016dbaef-kube-api-access-ws2jp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.671447 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.671632 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.689736 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.689782 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.690272 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws2jp\" (UniqueName: \"kubernetes.io/projected/f5ac8c56-2109-41c7-8129-5561016dbaef-kube-api-access-ws2jp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ch9l8\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:10 crc kubenswrapper[4626]: I0223 07:11:10.843697 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" Feb 23 07:11:11 crc kubenswrapper[4626]: I0223 07:11:11.356074 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8"] Feb 23 07:11:11 crc kubenswrapper[4626]: I0223 07:11:11.435046 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" event={"ID":"f5ac8c56-2109-41c7-8129-5561016dbaef","Type":"ContainerStarted","Data":"785720cee753403c762e942d4fc8351acae1a0eb0687bae6e8a27a328faecfee"} Feb 23 07:11:12 crc kubenswrapper[4626]: I0223 07:11:12.443644 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" event={"ID":"f5ac8c56-2109-41c7-8129-5561016dbaef","Type":"ContainerStarted","Data":"4439ec0b5c3d2eb92441eab42bc0922caefe01da3a4caa3e1ef5987358d9e63a"} Feb 23 07:11:12 crc kubenswrapper[4626]: I0223 07:11:12.467609 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" podStartSLOduration=1.951458124 podStartE2EDuration="2.467591153s" podCreationTimestamp="2026-02-23 07:11:10 +0000 UTC" firstStartedPulling="2026-02-23 07:11:11.363089516 +0000 UTC m=+1823.702418782" lastFinishedPulling="2026-02-23 07:11:11.879222545 +0000 UTC m=+1824.218551811" observedRunningTime="2026-02-23 07:11:12.460336483 +0000 UTC m=+1824.799665749" watchObservedRunningTime="2026-02-23 07:11:12.467591153 +0000 UTC m=+1824.806920419" Feb 23 07:11:13 crc kubenswrapper[4626]: I0223 07:11:13.982771 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:11:13 crc kubenswrapper[4626]: E0223 07:11:13.983490 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:11:20 crc kubenswrapper[4626]: I0223 07:11:20.404861 4626 scope.go:117] "RemoveContainer" containerID="d1d0db67d8f6d1719fb0fed314f5847de97023960d6f432e4a6272a4ae8aa88d" Feb 23 07:11:20 crc kubenswrapper[4626]: I0223 07:11:20.432057 4626 scope.go:117] "RemoveContainer" containerID="249c7803947ff2d7ef7677d7765267cd90c6832e69f6ae1cb298e0c245908d93" Feb 23 07:11:20 crc kubenswrapper[4626]: I0223 07:11:20.474136 4626 scope.go:117] "RemoveContainer" containerID="95eb52a0e57988e2c3636f5e12d7ec782f7c1266674e9d9dbeaabcdf5cfda961" Feb 23 07:11:20 crc kubenswrapper[4626]: I0223 07:11:20.511686 4626 scope.go:117] "RemoveContainer" containerID="3753ad5d94d4b119cfca356771f9b02e62b79b409ad5d5edc17146e84a4ed967" Feb 23 07:11:20 crc kubenswrapper[4626]: I0223 07:11:20.545677 4626 scope.go:117] "RemoveContainer" containerID="c457b95e7a4ecfa6eb5de8a1deb3c5e649f263f1853b47d66f706143fa3becd4" Feb 23 07:11:20 crc kubenswrapper[4626]: I0223 07:11:20.592457 4626 scope.go:117] "RemoveContainer" containerID="6df77acb04c766991ba70857651cdb2310fcb94c2e083fd9ac18e6d978a291f9" Feb 23 07:11:28 crc kubenswrapper[4626]: I0223 07:11:28.061189 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-w65pk"] Feb 23 07:11:28 crc kubenswrapper[4626]: I0223 07:11:28.076604 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-w65pk"] Feb 23 07:11:28 crc kubenswrapper[4626]: I0223 07:11:28.982271 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:11:28 crc kubenswrapper[4626]: E0223 07:11:28.983198 4626 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:11:29 crc kubenswrapper[4626]: I0223 07:11:29.993428 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd63778-7b2b-4377-b4c3-62b2c15d17e5" path="/var/lib/kubelet/pods/3fd63778-7b2b-4377-b4c3-62b2c15d17e5/volumes" Feb 23 07:11:40 crc kubenswrapper[4626]: I0223 07:11:40.983340 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:11:40 crc kubenswrapper[4626]: E0223 07:11:40.983927 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:11:42 crc kubenswrapper[4626]: I0223 07:11:42.729006 4626 generic.go:334] "Generic (PLEG): container finished" podID="f5ac8c56-2109-41c7-8129-5561016dbaef" containerID="4439ec0b5c3d2eb92441eab42bc0922caefe01da3a4caa3e1ef5987358d9e63a" exitCode=0 Feb 23 07:11:42 crc kubenswrapper[4626]: I0223 07:11:42.729081 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" event={"ID":"f5ac8c56-2109-41c7-8129-5561016dbaef","Type":"ContainerDied","Data":"4439ec0b5c3d2eb92441eab42bc0922caefe01da3a4caa3e1ef5987358d9e63a"} Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.136297 4626 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.187062 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-inventory\") pod \"f5ac8c56-2109-41c7-8129-5561016dbaef\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") "
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.187168 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-ssh-key-openstack-edpm-ipam\") pod \"f5ac8c56-2109-41c7-8129-5561016dbaef\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") "
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.187302 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws2jp\" (UniqueName: \"kubernetes.io/projected/f5ac8c56-2109-41c7-8129-5561016dbaef-kube-api-access-ws2jp\") pod \"f5ac8c56-2109-41c7-8129-5561016dbaef\" (UID: \"f5ac8c56-2109-41c7-8129-5561016dbaef\") "
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.198182 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ac8c56-2109-41c7-8129-5561016dbaef-kube-api-access-ws2jp" (OuterVolumeSpecName: "kube-api-access-ws2jp") pod "f5ac8c56-2109-41c7-8129-5561016dbaef" (UID: "f5ac8c56-2109-41c7-8129-5561016dbaef"). InnerVolumeSpecName "kube-api-access-ws2jp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.214169 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5ac8c56-2109-41c7-8129-5561016dbaef" (UID: "f5ac8c56-2109-41c7-8129-5561016dbaef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.215110 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-inventory" (OuterVolumeSpecName: "inventory") pod "f5ac8c56-2109-41c7-8129-5561016dbaef" (UID: "f5ac8c56-2109-41c7-8129-5561016dbaef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.291645 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.291682 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5ac8c56-2109-41c7-8129-5561016dbaef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.291696 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws2jp\" (UniqueName: \"kubernetes.io/projected/f5ac8c56-2109-41c7-8129-5561016dbaef-kube-api-access-ws2jp\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.757738 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8" event={"ID":"f5ac8c56-2109-41c7-8129-5561016dbaef","Type":"ContainerDied","Data":"785720cee753403c762e942d4fc8351acae1a0eb0687bae6e8a27a328faecfee"}
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.757796 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ch9l8"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.757802 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785720cee753403c762e942d4fc8351acae1a0eb0687bae6e8a27a328faecfee"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.946215 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"]
Feb 23 07:11:44 crc kubenswrapper[4626]: E0223 07:11:44.947880 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ac8c56-2109-41c7-8129-5561016dbaef" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.947913 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ac8c56-2109-41c7-8129-5561016dbaef" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.948271 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ac8c56-2109-41c7-8129-5561016dbaef" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.949154 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.952348 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.952694 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.952838 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.952867 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6"
Feb 23 07:11:44 crc kubenswrapper[4626]: I0223 07:11:44.960755 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"]
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.005567 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.005867 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjnxh\" (UniqueName: \"kubernetes.io/projected/1ea04624-3b44-4b2b-b89d-7799440e264f-kube-api-access-hjnxh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.005908 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.108473 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjnxh\" (UniqueName: \"kubernetes.io/projected/1ea04624-3b44-4b2b-b89d-7799440e264f-kube-api-access-hjnxh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.108895 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.108981 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.114611 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.115798 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.127641 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjnxh\" (UniqueName: \"kubernetes.io/projected/1ea04624-3b44-4b2b-b89d-7799440e264f-kube-api-access-hjnxh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zckcd\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.273901 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:11:45 crc kubenswrapper[4626]: I0223 07:11:45.802212 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"]
Feb 23 07:11:46 crc kubenswrapper[4626]: I0223 07:11:46.777326 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd" event={"ID":"1ea04624-3b44-4b2b-b89d-7799440e264f","Type":"ContainerStarted","Data":"abcf2f003568642bbe0eacf942cb31d61b93dbb603455b3da26075b3af0c8930"}
Feb 23 07:11:46 crc kubenswrapper[4626]: I0223 07:11:46.777739 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd" event={"ID":"1ea04624-3b44-4b2b-b89d-7799440e264f","Type":"ContainerStarted","Data":"6ad6e13158d79e483dc909ee07775447967ca4583e271e2b9d9bcbf77ef3c790"}
Feb 23 07:11:46 crc kubenswrapper[4626]: I0223 07:11:46.802846 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd" podStartSLOduration=2.315438949 podStartE2EDuration="2.802822248s" podCreationTimestamp="2026-02-23 07:11:44 +0000 UTC" firstStartedPulling="2026-02-23 07:11:45.810333355 +0000 UTC m=+1858.149662621" lastFinishedPulling="2026-02-23 07:11:46.297716653 +0000 UTC m=+1858.637045920" observedRunningTime="2026-02-23 07:11:46.793792362 +0000 UTC m=+1859.133121628" watchObservedRunningTime="2026-02-23 07:11:46.802822248 +0000 UTC m=+1859.142151515"
Feb 23 07:11:51 crc kubenswrapper[4626]: I0223 07:11:51.054896 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pbxnw"]
Feb 23 07:11:51 crc kubenswrapper[4626]: I0223 07:11:51.070302 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pbxnw"]
Feb 23 07:11:51 crc kubenswrapper[4626]: I0223 07:11:51.083309 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-682fl"]
Feb 23 07:11:51 crc kubenswrapper[4626]: I0223 07:11:51.091896 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-682fl"]
Feb 23 07:11:51 crc kubenswrapper[4626]: I0223 07:11:51.991236 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2405cf95-00e4-40c0-bd99-266460b42580" path="/var/lib/kubelet/pods/2405cf95-00e4-40c0-bd99-266460b42580/volumes"
Feb 23 07:11:51 crc kubenswrapper[4626]: I0223 07:11:51.991841 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35" path="/var/lib/kubelet/pods/d6d77bbd-9bcf-4b81-aa2b-a1f57ad60e35/volumes"
Feb 23 07:11:52 crc kubenswrapper[4626]: I0223 07:11:52.983628 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:11:52 crc kubenswrapper[4626]: E0223 07:11:52.984317 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:12:05 crc kubenswrapper[4626]: I0223 07:12:05.982037 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:12:05 crc kubenswrapper[4626]: E0223 07:12:05.982867 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:12:17 crc kubenswrapper[4626]: I0223 07:12:17.992259 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:12:17 crc kubenswrapper[4626]: E0223 07:12:17.994962 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:12:20 crc kubenswrapper[4626]: I0223 07:12:20.716860 4626 scope.go:117] "RemoveContainer" containerID="de25619430f26966b0309fa56049de3497b6f1677868157e6ec74984e187cfed"
Feb 23 07:12:20 crc kubenswrapper[4626]: I0223 07:12:20.752761 4626 scope.go:117] "RemoveContainer" containerID="40d143df27f38c86086e8f336b531fdac2ea0921b36da45ef4f48a1637d3e67d"
Feb 23 07:12:20 crc kubenswrapper[4626]: I0223 07:12:20.779810 4626 scope.go:117] "RemoveContainer" containerID="378992167abee806d660f94752873bd86bfb062ec2479f143a977125d0c4c411"
Feb 23 07:12:24 crc kubenswrapper[4626]: I0223 07:12:24.145428 4626 generic.go:334] "Generic (PLEG): container finished" podID="1ea04624-3b44-4b2b-b89d-7799440e264f" containerID="abcf2f003568642bbe0eacf942cb31d61b93dbb603455b3da26075b3af0c8930" exitCode=0
Feb 23 07:12:24 crc kubenswrapper[4626]: I0223 07:12:24.145555 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd" event={"ID":"1ea04624-3b44-4b2b-b89d-7799440e264f","Type":"ContainerDied","Data":"abcf2f003568642bbe0eacf942cb31d61b93dbb603455b3da26075b3af0c8930"}
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.498717 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.607212 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-ssh-key-openstack-edpm-ipam\") pod \"1ea04624-3b44-4b2b-b89d-7799440e264f\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") "
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.607549 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjnxh\" (UniqueName: \"kubernetes.io/projected/1ea04624-3b44-4b2b-b89d-7799440e264f-kube-api-access-hjnxh\") pod \"1ea04624-3b44-4b2b-b89d-7799440e264f\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") "
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.607645 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-inventory\") pod \"1ea04624-3b44-4b2b-b89d-7799440e264f\" (UID: \"1ea04624-3b44-4b2b-b89d-7799440e264f\") "
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.614813 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea04624-3b44-4b2b-b89d-7799440e264f-kube-api-access-hjnxh" (OuterVolumeSpecName: "kube-api-access-hjnxh") pod "1ea04624-3b44-4b2b-b89d-7799440e264f" (UID: "1ea04624-3b44-4b2b-b89d-7799440e264f"). InnerVolumeSpecName "kube-api-access-hjnxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.631705 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1ea04624-3b44-4b2b-b89d-7799440e264f" (UID: "1ea04624-3b44-4b2b-b89d-7799440e264f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.635112 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-inventory" (OuterVolumeSpecName: "inventory") pod "1ea04624-3b44-4b2b-b89d-7799440e264f" (UID: "1ea04624-3b44-4b2b-b89d-7799440e264f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.718545 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.718587 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjnxh\" (UniqueName: \"kubernetes.io/projected/1ea04624-3b44-4b2b-b89d-7799440e264f-kube-api-access-hjnxh\") on node \"crc\" DevicePath \"\""
Feb 23 07:12:25 crc kubenswrapper[4626]: I0223 07:12:25.718600 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ea04624-3b44-4b2b-b89d-7799440e264f-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.164778 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd" event={"ID":"1ea04624-3b44-4b2b-b89d-7799440e264f","Type":"ContainerDied","Data":"6ad6e13158d79e483dc909ee07775447967ca4583e271e2b9d9bcbf77ef3c790"}
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.164828 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ad6e13158d79e483dc909ee07775447967ca4583e271e2b9d9bcbf77ef3c790"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.164906 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zckcd"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.259067 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swrn8"]
Feb 23 07:12:26 crc kubenswrapper[4626]: E0223 07:12:26.259837 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea04624-3b44-4b2b-b89d-7799440e264f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.259936 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea04624-3b44-4b2b-b89d-7799440e264f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.260355 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ea04624-3b44-4b2b-b89d-7799440e264f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.261288 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.265132 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.265310 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.266398 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.266616 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.270144 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swrn8"]
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.337410 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.337478 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.337573 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tfmd\" (UniqueName: \"kubernetes.io/projected/28fc407d-b22d-432c-b5c1-7fbb18142e65-kube-api-access-2tfmd\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.439768 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tfmd\" (UniqueName: \"kubernetes.io/projected/28fc407d-b22d-432c-b5c1-7fbb18142e65-kube-api-access-2tfmd\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.439999 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.440106 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.445188 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.447069 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.455136 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tfmd\" (UniqueName: \"kubernetes.io/projected/28fc407d-b22d-432c-b5c1-7fbb18142e65-kube-api-access-2tfmd\") pod \"ssh-known-hosts-edpm-deployment-swrn8\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") " pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:26 crc kubenswrapper[4626]: I0223 07:12:26.578151 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:27 crc kubenswrapper[4626]: I0223 07:12:27.116999 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swrn8"]
Feb 23 07:12:27 crc kubenswrapper[4626]: I0223 07:12:27.174691 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8" event={"ID":"28fc407d-b22d-432c-b5c1-7fbb18142e65","Type":"ContainerStarted","Data":"a3f119d47edc2b819d95032033f6a7ca588bd559db64847ad88248f1c563f746"}
Feb 23 07:12:28 crc kubenswrapper[4626]: I0223 07:12:28.194337 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8" event={"ID":"28fc407d-b22d-432c-b5c1-7fbb18142e65","Type":"ContainerStarted","Data":"fefbace2a002516f0e64e8a153225fd34bf95706cc9610cc9a0e54c229a48fec"}
Feb 23 07:12:28 crc kubenswrapper[4626]: I0223 07:12:28.232104 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8" podStartSLOduration=1.660171338 podStartE2EDuration="2.232082999s" podCreationTimestamp="2026-02-23 07:12:26 +0000 UTC" firstStartedPulling="2026-02-23 07:12:27.112071401 +0000 UTC m=+1899.451400668" lastFinishedPulling="2026-02-23 07:12:27.683983064 +0000 UTC m=+1900.023312329" observedRunningTime="2026-02-23 07:12:28.223305007 +0000 UTC m=+1900.562634273" watchObservedRunningTime="2026-02-23 07:12:28.232082999 +0000 UTC m=+1900.571412264"
Feb 23 07:12:31 crc kubenswrapper[4626]: I0223 07:12:31.982820 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac"
Feb 23 07:12:32 crc kubenswrapper[4626]: I0223 07:12:32.235821 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"522c533173b6dc8610119c7e6504c043a2bd72039bef9f6e0109c726475aba01"}
Feb 23 07:12:34 crc kubenswrapper[4626]: I0223 07:12:34.261161 4626 generic.go:334] "Generic (PLEG): container finished" podID="28fc407d-b22d-432c-b5c1-7fbb18142e65" containerID="fefbace2a002516f0e64e8a153225fd34bf95706cc9610cc9a0e54c229a48fec" exitCode=0
Feb 23 07:12:34 crc kubenswrapper[4626]: I0223 07:12:34.261354 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8" event={"ID":"28fc407d-b22d-432c-b5c1-7fbb18142e65","Type":"ContainerDied","Data":"fefbace2a002516f0e64e8a153225fd34bf95706cc9610cc9a0e54c229a48fec"}
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.604541 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.674809 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-inventory-0\") pod \"28fc407d-b22d-432c-b5c1-7fbb18142e65\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") "
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.675178 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-ssh-key-openstack-edpm-ipam\") pod \"28fc407d-b22d-432c-b5c1-7fbb18142e65\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") "
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.675252 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tfmd\" (UniqueName: \"kubernetes.io/projected/28fc407d-b22d-432c-b5c1-7fbb18142e65-kube-api-access-2tfmd\") pod \"28fc407d-b22d-432c-b5c1-7fbb18142e65\" (UID: \"28fc407d-b22d-432c-b5c1-7fbb18142e65\") "
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.681333 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fc407d-b22d-432c-b5c1-7fbb18142e65-kube-api-access-2tfmd" (OuterVolumeSpecName: "kube-api-access-2tfmd") pod "28fc407d-b22d-432c-b5c1-7fbb18142e65" (UID: "28fc407d-b22d-432c-b5c1-7fbb18142e65"). InnerVolumeSpecName "kube-api-access-2tfmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.702478 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "28fc407d-b22d-432c-b5c1-7fbb18142e65" (UID: "28fc407d-b22d-432c-b5c1-7fbb18142e65"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.703380 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28fc407d-b22d-432c-b5c1-7fbb18142e65" (UID: "28fc407d-b22d-432c-b5c1-7fbb18142e65"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.778896 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tfmd\" (UniqueName: \"kubernetes.io/projected/28fc407d-b22d-432c-b5c1-7fbb18142e65-kube-api-access-2tfmd\") on node \"crc\" DevicePath \"\""
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.778930 4626 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 23 07:12:35 crc kubenswrapper[4626]: I0223 07:12:35.778949 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28fc407d-b22d-432c-b5c1-7fbb18142e65-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.283808 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8" event={"ID":"28fc407d-b22d-432c-b5c1-7fbb18142e65","Type":"ContainerDied","Data":"a3f119d47edc2b819d95032033f6a7ca588bd559db64847ad88248f1c563f746"}
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.283973 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3f119d47edc2b819d95032033f6a7ca588bd559db64847ad88248f1c563f746"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.283930 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swrn8"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.348704 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"]
Feb 23 07:12:36 crc kubenswrapper[4626]: E0223 07:12:36.349105 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fc407d-b22d-432c-b5c1-7fbb18142e65" containerName="ssh-known-hosts-edpm-deployment"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.349122 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fc407d-b22d-432c-b5c1-7fbb18142e65" containerName="ssh-known-hosts-edpm-deployment"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.349288 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="28fc407d-b22d-432c-b5c1-7fbb18142e65" containerName="ssh-known-hosts-edpm-deployment"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.349918 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.354411 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.354604 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.354454 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.354783 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.370653 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"]
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.492886 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9fdk\" (UniqueName: \"kubernetes.io/projected/8e11e133-311c-4bd4-9989-a0e05f665f6a-kube-api-access-z9fdk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.493221 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.493318 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.596342 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9fdk\" (UniqueName: \"kubernetes.io/projected/8e11e133-311c-4bd4-9989-a0e05f665f6a-kube-api-access-z9fdk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.596583 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.596706 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.602248 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.603064 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.614804 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9fdk\" (UniqueName: \"kubernetes.io/projected/8e11e133-311c-4bd4-9989-a0e05f665f6a-kube-api-access-z9fdk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pxhls\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"
Feb 23 07:12:36 crc kubenswrapper[4626]: I0223 07:12:36.665532 4626 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" Feb 23 07:12:37 crc kubenswrapper[4626]: I0223 07:12:37.037911 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tz4lq"] Feb 23 07:12:37 crc kubenswrapper[4626]: I0223 07:12:37.049644 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tz4lq"] Feb 23 07:12:37 crc kubenswrapper[4626]: I0223 07:12:37.177945 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls"] Feb 23 07:12:37 crc kubenswrapper[4626]: I0223 07:12:37.294639 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" event={"ID":"8e11e133-311c-4bd4-9989-a0e05f665f6a","Type":"ContainerStarted","Data":"91eb3493876fe85662b83028ff2879e8d4136fe5a70ba19e59a7a333c82e73fc"} Feb 23 07:12:37 crc kubenswrapper[4626]: I0223 07:12:37.994107 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59" path="/var/lib/kubelet/pods/da2cfd29-c9ad-4018-86f0-ffeb1d7b8b59/volumes" Feb 23 07:12:38 crc kubenswrapper[4626]: I0223 07:12:38.306289 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" event={"ID":"8e11e133-311c-4bd4-9989-a0e05f665f6a","Type":"ContainerStarted","Data":"bcbf5e28eec145778aa2a882723895b088d08ec5623e6d76c58f24c836178f76"} Feb 23 07:12:38 crc kubenswrapper[4626]: I0223 07:12:38.332107 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" podStartSLOduration=1.7996740469999999 podStartE2EDuration="2.3320855s" podCreationTimestamp="2026-02-23 07:12:36 +0000 UTC" firstStartedPulling="2026-02-23 07:12:37.169306954 +0000 UTC m=+1909.508636220" lastFinishedPulling="2026-02-23 07:12:37.701718408 +0000 UTC 
m=+1910.041047673" observedRunningTime="2026-02-23 07:12:38.321339547 +0000 UTC m=+1910.660668813" watchObservedRunningTime="2026-02-23 07:12:38.3320855 +0000 UTC m=+1910.671414765" Feb 23 07:12:44 crc kubenswrapper[4626]: I0223 07:12:44.362304 4626 generic.go:334] "Generic (PLEG): container finished" podID="8e11e133-311c-4bd4-9989-a0e05f665f6a" containerID="bcbf5e28eec145778aa2a882723895b088d08ec5623e6d76c58f24c836178f76" exitCode=0 Feb 23 07:12:44 crc kubenswrapper[4626]: I0223 07:12:44.362393 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" event={"ID":"8e11e133-311c-4bd4-9989-a0e05f665f6a","Type":"ContainerDied","Data":"bcbf5e28eec145778aa2a882723895b088d08ec5623e6d76c58f24c836178f76"} Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.723640 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.825032 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-ssh-key-openstack-edpm-ipam\") pod \"8e11e133-311c-4bd4-9989-a0e05f665f6a\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.825203 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9fdk\" (UniqueName: \"kubernetes.io/projected/8e11e133-311c-4bd4-9989-a0e05f665f6a-kube-api-access-z9fdk\") pod \"8e11e133-311c-4bd4-9989-a0e05f665f6a\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.825429 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-inventory\") pod 
\"8e11e133-311c-4bd4-9989-a0e05f665f6a\" (UID: \"8e11e133-311c-4bd4-9989-a0e05f665f6a\") " Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.833035 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e11e133-311c-4bd4-9989-a0e05f665f6a-kube-api-access-z9fdk" (OuterVolumeSpecName: "kube-api-access-z9fdk") pod "8e11e133-311c-4bd4-9989-a0e05f665f6a" (UID: "8e11e133-311c-4bd4-9989-a0e05f665f6a"). InnerVolumeSpecName "kube-api-access-z9fdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.852656 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e11e133-311c-4bd4-9989-a0e05f665f6a" (UID: "8e11e133-311c-4bd4-9989-a0e05f665f6a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.856553 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-inventory" (OuterVolumeSpecName: "inventory") pod "8e11e133-311c-4bd4-9989-a0e05f665f6a" (UID: "8e11e133-311c-4bd4-9989-a0e05f665f6a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.928074 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.928274 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9fdk\" (UniqueName: \"kubernetes.io/projected/8e11e133-311c-4bd4-9989-a0e05f665f6a-kube-api-access-z9fdk\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:45 crc kubenswrapper[4626]: I0223 07:12:45.928333 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e11e133-311c-4bd4-9989-a0e05f665f6a-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.383590 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" event={"ID":"8e11e133-311c-4bd4-9989-a0e05f665f6a","Type":"ContainerDied","Data":"91eb3493876fe85662b83028ff2879e8d4136fe5a70ba19e59a7a333c82e73fc"} Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.383658 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91eb3493876fe85662b83028ff2879e8d4136fe5a70ba19e59a7a333c82e73fc" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.383692 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pxhls" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.453580 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk"] Feb 23 07:12:46 crc kubenswrapper[4626]: E0223 07:12:46.454008 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e11e133-311c-4bd4-9989-a0e05f665f6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.454026 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e11e133-311c-4bd4-9989-a0e05f665f6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.454237 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e11e133-311c-4bd4-9989-a0e05f665f6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.454964 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.458604 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.458632 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.458800 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.459052 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.464570 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk"] Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.542543 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.542909 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.543070 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjnl\" (UniqueName: \"kubernetes.io/projected/80dffd5f-db5e-4946-9efe-8137bf36671f-kube-api-access-lrjnl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.645415 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.645548 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjnl\" (UniqueName: \"kubernetes.io/projected/80dffd5f-db5e-4946-9efe-8137bf36671f-kube-api-access-lrjnl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.645721 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.650929 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.651717 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.664302 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjnl\" (UniqueName: \"kubernetes.io/projected/80dffd5f-db5e-4946-9efe-8137bf36671f-kube-api-access-lrjnl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:46 crc kubenswrapper[4626]: I0223 07:12:46.768388 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:47 crc kubenswrapper[4626]: I0223 07:12:47.262079 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk"] Feb 23 07:12:47 crc kubenswrapper[4626]: I0223 07:12:47.397872 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" event={"ID":"80dffd5f-db5e-4946-9efe-8137bf36671f","Type":"ContainerStarted","Data":"2e2b1dbd12b1ea57885d6df8c33e95c341185e9feb3bbe1f2232bf5606e42a32"} Feb 23 07:12:48 crc kubenswrapper[4626]: I0223 07:12:48.407702 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" event={"ID":"80dffd5f-db5e-4946-9efe-8137bf36671f","Type":"ContainerStarted","Data":"0e92381f4a91fce4e022f41cbff3c6793d7fb8661f6d4011af1372c61cf2849f"} Feb 23 07:12:48 crc kubenswrapper[4626]: I0223 07:12:48.433767 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" podStartSLOduration=1.892070744 podStartE2EDuration="2.433728653s" podCreationTimestamp="2026-02-23 07:12:46 +0000 UTC" firstStartedPulling="2026-02-23 07:12:47.271560158 +0000 UTC m=+1919.610889423" lastFinishedPulling="2026-02-23 07:12:47.813218067 +0000 UTC m=+1920.152547332" observedRunningTime="2026-02-23 07:12:48.42156622 +0000 UTC m=+1920.760895486" watchObservedRunningTime="2026-02-23 07:12:48.433728653 +0000 UTC m=+1920.773057918" Feb 23 07:12:55 crc kubenswrapper[4626]: I0223 07:12:55.468264 4626 generic.go:334] "Generic (PLEG): container finished" podID="80dffd5f-db5e-4946-9efe-8137bf36671f" containerID="0e92381f4a91fce4e022f41cbff3c6793d7fb8661f6d4011af1372c61cf2849f" exitCode=0 Feb 23 07:12:55 crc kubenswrapper[4626]: I0223 07:12:55.468361 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" event={"ID":"80dffd5f-db5e-4946-9efe-8137bf36671f","Type":"ContainerDied","Data":"0e92381f4a91fce4e022f41cbff3c6793d7fb8661f6d4011af1372c61cf2849f"} Feb 23 07:12:56 crc kubenswrapper[4626]: I0223 07:12:56.846438 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:56 crc kubenswrapper[4626]: I0223 07:12:56.980584 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-ssh-key-openstack-edpm-ipam\") pod \"80dffd5f-db5e-4946-9efe-8137bf36671f\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " Feb 23 07:12:56 crc kubenswrapper[4626]: I0223 07:12:56.980805 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrjnl\" (UniqueName: \"kubernetes.io/projected/80dffd5f-db5e-4946-9efe-8137bf36671f-kube-api-access-lrjnl\") pod \"80dffd5f-db5e-4946-9efe-8137bf36671f\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " Feb 23 07:12:56 crc kubenswrapper[4626]: I0223 07:12:56.980963 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-inventory\") pod \"80dffd5f-db5e-4946-9efe-8137bf36671f\" (UID: \"80dffd5f-db5e-4946-9efe-8137bf36671f\") " Feb 23 07:12:56 crc kubenswrapper[4626]: I0223 07:12:56.987035 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80dffd5f-db5e-4946-9efe-8137bf36671f-kube-api-access-lrjnl" (OuterVolumeSpecName: "kube-api-access-lrjnl") pod "80dffd5f-db5e-4946-9efe-8137bf36671f" (UID: "80dffd5f-db5e-4946-9efe-8137bf36671f"). InnerVolumeSpecName "kube-api-access-lrjnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.007693 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-inventory" (OuterVolumeSpecName: "inventory") pod "80dffd5f-db5e-4946-9efe-8137bf36671f" (UID: "80dffd5f-db5e-4946-9efe-8137bf36671f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.007875 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80dffd5f-db5e-4946-9efe-8137bf36671f" (UID: "80dffd5f-db5e-4946-9efe-8137bf36671f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.085113 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.085145 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80dffd5f-db5e-4946-9efe-8137bf36671f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.085161 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrjnl\" (UniqueName: \"kubernetes.io/projected/80dffd5f-db5e-4946-9efe-8137bf36671f-kube-api-access-lrjnl\") on node \"crc\" DevicePath \"\"" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.491016 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" 
event={"ID":"80dffd5f-db5e-4946-9efe-8137bf36671f","Type":"ContainerDied","Data":"2e2b1dbd12b1ea57885d6df8c33e95c341185e9feb3bbe1f2232bf5606e42a32"} Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.491080 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2b1dbd12b1ea57885d6df8c33e95c341185e9feb3bbe1f2232bf5606e42a32" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.491135 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.578261 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8"] Feb 23 07:12:57 crc kubenswrapper[4626]: E0223 07:12:57.579144 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80dffd5f-db5e-4946-9efe-8137bf36671f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.579255 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="80dffd5f-db5e-4946-9efe-8137bf36671f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.579564 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="80dffd5f-db5e-4946-9efe-8137bf36671f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.580532 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.582852 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.594662 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.594778 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.594835 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.594843 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.595127 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.595247 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.595453 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.599347 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8"] Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.699237 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.699302 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.699337 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.699364 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.699422 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.699549 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.699610 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.700058 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.700591 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.700871 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.700968 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.701018 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.701305 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.701394 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742rk\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-kube-api-access-742rk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803209 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803283 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803312 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803368 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803423 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803471 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803523 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803551 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803581 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803612 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742rk\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-kube-api-access-742rk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803716 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803753 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803775 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.803791 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.808421 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.809788 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.810070 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.810655 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.812243 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 
07:12:57.818890 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.820241 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.821892 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742rk\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-kube-api-access-742rk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.822758 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.822779 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.823267 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.823706 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.823830 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.825221 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:57 crc kubenswrapper[4626]: I0223 07:12:57.902157 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:12:58 crc kubenswrapper[4626]: I0223 07:12:58.453513 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8"] Feb 23 07:12:58 crc kubenswrapper[4626]: I0223 07:12:58.500311 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" event={"ID":"3d1fa218-95df-487b-b4d0-be0da8e72c58","Type":"ContainerStarted","Data":"40d43e5a89347a69abcc306725bb81f9784436c257990cc5cb911323a857bfd4"} Feb 23 07:12:59 crc kubenswrapper[4626]: I0223 07:12:59.514354 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" event={"ID":"3d1fa218-95df-487b-b4d0-be0da8e72c58","Type":"ContainerStarted","Data":"0ee5aad02f84e313ad82e22349adc5d83f45e69a1546b2c04cc4233b78e5623f"} Feb 23 07:12:59 crc kubenswrapper[4626]: I0223 07:12:59.531660 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" podStartSLOduration=2.014158721 podStartE2EDuration="2.531644381s" podCreationTimestamp="2026-02-23 07:12:57 +0000 UTC" firstStartedPulling="2026-02-23 07:12:58.472121917 +0000 UTC m=+1930.811451184" lastFinishedPulling="2026-02-23 07:12:58.989607578 +0000 UTC m=+1931.328936844" observedRunningTime="2026-02-23 07:12:59.528855452 +0000 UTC m=+1931.868184718" watchObservedRunningTime="2026-02-23 07:12:59.531644381 +0000 UTC m=+1931.870973646" Feb 23 07:13:20 crc kubenswrapper[4626]: I0223 07:13:20.863187 4626 
scope.go:117] "RemoveContainer" containerID="3a8427c9ef8b10ef11481a165e8a99279a2aa7a0bd65951b9a942131c66f8cf3" Feb 23 07:13:27 crc kubenswrapper[4626]: I0223 07:13:27.820313 4626 generic.go:334] "Generic (PLEG): container finished" podID="3d1fa218-95df-487b-b4d0-be0da8e72c58" containerID="0ee5aad02f84e313ad82e22349adc5d83f45e69a1546b2c04cc4233b78e5623f" exitCode=0 Feb 23 07:13:27 crc kubenswrapper[4626]: I0223 07:13:27.820423 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" event={"ID":"3d1fa218-95df-487b-b4d0-be0da8e72c58","Type":"ContainerDied","Data":"0ee5aad02f84e313ad82e22349adc5d83f45e69a1546b2c04cc4233b78e5623f"} Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.260168 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.319346 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-libvirt-combined-ca-bundle\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.319462 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-repo-setup-combined-ca-bundle\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.319564 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-inventory\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: 
\"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.319644 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ovn-combined-ca-bundle\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.319755 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.319780 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ssh-key-openstack-edpm-ipam\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.319835 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.320006 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-telemetry-combined-ca-bundle\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: 
\"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.320088 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.320126 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-ovn-default-certs-0\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.320181 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-742rk\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-kube-api-access-742rk\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.320205 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-bootstrap-combined-ca-bundle\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.320347 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-neutron-metadata-combined-ca-bundle\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " 
Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.320385 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-nova-combined-ca-bundle\") pod \"3d1fa218-95df-487b-b4d0-be0da8e72c58\" (UID: \"3d1fa218-95df-487b-b4d0-be0da8e72c58\") " Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.331146 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.333861 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.336697 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-kube-api-access-742rk" (OuterVolumeSpecName: "kube-api-access-742rk") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "kube-api-access-742rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.337216 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.337959 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.338090 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.338274 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.340000 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.340586 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.341773 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.343689 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.345081 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.360581 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.363342 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-inventory" (OuterVolumeSpecName: "inventory") pod "3d1fa218-95df-487b-b4d0-be0da8e72c58" (UID: "3d1fa218-95df-487b-b4d0-be0da8e72c58"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.426980 4626 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427005 4626 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427018 4626 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427030 4626 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427045 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427054 4626 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427065 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427074 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427085 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427093 4626 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427102 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427111 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427127 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-742rk\" (UniqueName: \"kubernetes.io/projected/3d1fa218-95df-487b-b4d0-be0da8e72c58-kube-api-access-742rk\") on node \"crc\" DevicePath \"\"" 
Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.427138 4626 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1fa218-95df-487b-b4d0-be0da8e72c58-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.854880 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" event={"ID":"3d1fa218-95df-487b-b4d0-be0da8e72c58","Type":"ContainerDied","Data":"40d43e5a89347a69abcc306725bb81f9784436c257990cc5cb911323a857bfd4"} Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.855284 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d43e5a89347a69abcc306725bb81f9784436c257990cc5cb911323a857bfd4" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.854977 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.947200 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h"] Feb 23 07:13:29 crc kubenswrapper[4626]: E0223 07:13:29.947617 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1fa218-95df-487b-b4d0-be0da8e72c58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.947637 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1fa218-95df-487b-b4d0-be0da8e72c58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.947793 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1fa218-95df-487b-b4d0-be0da8e72c58" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 
07:13:29.948452 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.950336 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.950621 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.951121 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.951245 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.959782 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:13:29 crc kubenswrapper[4626]: I0223 07:13:29.960253 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h"] Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.040468 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97c11447-7070-4233-aaf2-7661d687049d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.040574 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ovn-combined-ca-bundle\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.040656 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.040775 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjqh\" (UniqueName: \"kubernetes.io/projected/97c11447-7070-4233-aaf2-7661d687049d-kube-api-access-7mjqh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.040943 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.143700 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.143820 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjqh\" (UniqueName: \"kubernetes.io/projected/97c11447-7070-4233-aaf2-7661d687049d-kube-api-access-7mjqh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.144187 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.144989 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97c11447-7070-4233-aaf2-7661d687049d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.145099 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.146589 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97c11447-7070-4233-aaf2-7661d687049d-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.149275 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.151980 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.152471 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.161413 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjqh\" (UniqueName: \"kubernetes.io/projected/97c11447-7070-4233-aaf2-7661d687049d-kube-api-access-7mjqh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-szt5h\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.263382 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.776154 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h"] Feb 23 07:13:30 crc kubenswrapper[4626]: I0223 07:13:30.863301 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" event={"ID":"97c11447-7070-4233-aaf2-7661d687049d","Type":"ContainerStarted","Data":"a5e2d832f86f77cfe4381980d7a0282eda0e9a053028b9f63583a7df04dd72f4"} Feb 23 07:13:31 crc kubenswrapper[4626]: I0223 07:13:31.875806 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" event={"ID":"97c11447-7070-4233-aaf2-7661d687049d","Type":"ContainerStarted","Data":"05299d84da1af103e1cc16effec5c940826e8ea9eabff71be5b494c768cb6d73"} Feb 23 07:13:31 crc kubenswrapper[4626]: I0223 07:13:31.894682 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" podStartSLOduration=2.271715042 podStartE2EDuration="2.894657153s" podCreationTimestamp="2026-02-23 07:13:29 +0000 UTC" firstStartedPulling="2026-02-23 07:13:30.787310046 +0000 UTC m=+1963.126639312" lastFinishedPulling="2026-02-23 07:13:31.410252157 +0000 UTC m=+1963.749581423" observedRunningTime="2026-02-23 07:13:31.891665813 +0000 UTC m=+1964.230995080" watchObservedRunningTime="2026-02-23 07:13:31.894657153 +0000 UTC m=+1964.233986419" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.327579 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwpc4"] Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.336709 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.347232 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwpc4"] Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.380286 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-utilities\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.380597 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sfv4\" (UniqueName: \"kubernetes.io/projected/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-kube-api-access-4sfv4\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.384376 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-catalog-content\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.486328 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-catalog-content\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.486407 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-utilities\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.486458 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sfv4\" (UniqueName: \"kubernetes.io/projected/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-kube-api-access-4sfv4\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.487060 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-catalog-content\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.487285 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-utilities\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.503031 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sfv4\" (UniqueName: \"kubernetes.io/projected/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-kube-api-access-4sfv4\") pod \"certified-operators-gwpc4\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:02 crc kubenswrapper[4626]: I0223 07:14:02.660493 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:03 crc kubenswrapper[4626]: I0223 07:14:03.200882 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwpc4"] Feb 23 07:14:04 crc kubenswrapper[4626]: I0223 07:14:04.236879 4626 generic.go:334] "Generic (PLEG): container finished" podID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerID="cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383" exitCode=0 Feb 23 07:14:04 crc kubenswrapper[4626]: I0223 07:14:04.236986 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpc4" event={"ID":"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0","Type":"ContainerDied","Data":"cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383"} Feb 23 07:14:04 crc kubenswrapper[4626]: I0223 07:14:04.237191 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpc4" event={"ID":"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0","Type":"ContainerStarted","Data":"bed292c3de8f8adb6c0263e80b1be98f55fc236d3c5642b90a237387e7a3963d"} Feb 23 07:14:05 crc kubenswrapper[4626]: I0223 07:14:05.251464 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpc4" event={"ID":"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0","Type":"ContainerStarted","Data":"669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7"} Feb 23 07:14:06 crc kubenswrapper[4626]: I0223 07:14:06.264745 4626 generic.go:334] "Generic (PLEG): container finished" podID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerID="669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7" exitCode=0 Feb 23 07:14:06 crc kubenswrapper[4626]: I0223 07:14:06.264856 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpc4" 
event={"ID":"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0","Type":"ContainerDied","Data":"669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7"} Feb 23 07:14:07 crc kubenswrapper[4626]: I0223 07:14:07.281589 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpc4" event={"ID":"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0","Type":"ContainerStarted","Data":"f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9"} Feb 23 07:14:07 crc kubenswrapper[4626]: I0223 07:14:07.312530 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwpc4" podStartSLOduration=2.736317792 podStartE2EDuration="5.312510931s" podCreationTimestamp="2026-02-23 07:14:02 +0000 UTC" firstStartedPulling="2026-02-23 07:14:04.238332429 +0000 UTC m=+1996.577661695" lastFinishedPulling="2026-02-23 07:14:06.814525568 +0000 UTC m=+1999.153854834" observedRunningTime="2026-02-23 07:14:07.301605258 +0000 UTC m=+1999.640934524" watchObservedRunningTime="2026-02-23 07:14:07.312510931 +0000 UTC m=+1999.651840187" Feb 23 07:14:12 crc kubenswrapper[4626]: I0223 07:14:12.660798 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:12 crc kubenswrapper[4626]: I0223 07:14:12.662209 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:12 crc kubenswrapper[4626]: I0223 07:14:12.710687 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:13 crc kubenswrapper[4626]: I0223 07:14:13.377258 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:13 crc kubenswrapper[4626]: I0223 07:14:13.430800 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-gwpc4"] Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.356062 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwpc4" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="registry-server" containerID="cri-o://f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9" gracePeriod=2 Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.788345 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.930344 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-utilities\") pod \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.930667 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sfv4\" (UniqueName: \"kubernetes.io/projected/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-kube-api-access-4sfv4\") pod \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.930724 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-catalog-content\") pod \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\" (UID: \"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0\") " Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.931053 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-utilities" (OuterVolumeSpecName: "utilities") pod "20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" (UID: 
"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.932690 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.937742 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-kube-api-access-4sfv4" (OuterVolumeSpecName: "kube-api-access-4sfv4") pod "20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" (UID: "20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0"). InnerVolumeSpecName "kube-api-access-4sfv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:14:15 crc kubenswrapper[4626]: I0223 07:14:15.973828 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" (UID: "20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.033811 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sfv4\" (UniqueName: \"kubernetes.io/projected/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-kube-api-access-4sfv4\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.033837 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.366805 4626 generic.go:334] "Generic (PLEG): container finished" podID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerID="f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9" exitCode=0 Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.366851 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpc4" event={"ID":"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0","Type":"ContainerDied","Data":"f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9"} Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.366881 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpc4" event={"ID":"20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0","Type":"ContainerDied","Data":"bed292c3de8f8adb6c0263e80b1be98f55fc236d3c5642b90a237387e7a3963d"} Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.366900 4626 scope.go:117] "RemoveContainer" containerID="f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.367035 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwpc4" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.391683 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwpc4"] Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.400722 4626 scope.go:117] "RemoveContainer" containerID="669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.400981 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwpc4"] Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.434854 4626 scope.go:117] "RemoveContainer" containerID="cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.457260 4626 scope.go:117] "RemoveContainer" containerID="f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9" Feb 23 07:14:16 crc kubenswrapper[4626]: E0223 07:14:16.457674 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9\": container with ID starting with f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9 not found: ID does not exist" containerID="f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.457706 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9"} err="failed to get container status \"f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9\": rpc error: code = NotFound desc = could not find container \"f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9\": container with ID starting with f20ed8caaf9d00bb7f2e2115b0e2d473b7c6f2393ec8bd276a7f94c9c73f88f9 not 
found: ID does not exist" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.457729 4626 scope.go:117] "RemoveContainer" containerID="669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7" Feb 23 07:14:16 crc kubenswrapper[4626]: E0223 07:14:16.457981 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7\": container with ID starting with 669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7 not found: ID does not exist" containerID="669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.458004 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7"} err="failed to get container status \"669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7\": rpc error: code = NotFound desc = could not find container \"669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7\": container with ID starting with 669776ed802af9045ac7777a58e941ddd1092b2dd08467a7c441e5553e7f77a7 not found: ID does not exist" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.458018 4626 scope.go:117] "RemoveContainer" containerID="cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383" Feb 23 07:14:16 crc kubenswrapper[4626]: E0223 07:14:16.458258 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383\": container with ID starting with cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383 not found: ID does not exist" containerID="cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383" Feb 23 07:14:16 crc kubenswrapper[4626]: I0223 07:14:16.458280 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383"} err="failed to get container status \"cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383\": rpc error: code = NotFound desc = could not find container \"cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383\": container with ID starting with cdf2178a212a736ca66b800976b65cfac48428b6e7c326683b79b23216e84383 not found: ID does not exist" Feb 23 07:14:17 crc kubenswrapper[4626]: I0223 07:14:17.993159 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" path="/var/lib/kubelet/pods/20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0/volumes" Feb 23 07:14:22 crc kubenswrapper[4626]: I0223 07:14:22.432928 4626 generic.go:334] "Generic (PLEG): container finished" podID="97c11447-7070-4233-aaf2-7661d687049d" containerID="05299d84da1af103e1cc16effec5c940826e8ea9eabff71be5b494c768cb6d73" exitCode=0 Feb 23 07:14:22 crc kubenswrapper[4626]: I0223 07:14:22.433006 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" event={"ID":"97c11447-7070-4233-aaf2-7661d687049d","Type":"ContainerDied","Data":"05299d84da1af103e1cc16effec5c940826e8ea9eabff71be5b494c768cb6d73"} Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.866440 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.923022 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ssh-key-openstack-edpm-ipam\") pod \"97c11447-7070-4233-aaf2-7661d687049d\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.923115 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mjqh\" (UniqueName: \"kubernetes.io/projected/97c11447-7070-4233-aaf2-7661d687049d-kube-api-access-7mjqh\") pod \"97c11447-7070-4233-aaf2-7661d687049d\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.923150 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-inventory\") pod \"97c11447-7070-4233-aaf2-7661d687049d\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.923470 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ovn-combined-ca-bundle\") pod \"97c11447-7070-4233-aaf2-7661d687049d\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.923597 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97c11447-7070-4233-aaf2-7661d687049d-ovncontroller-config-0\") pod \"97c11447-7070-4233-aaf2-7661d687049d\" (UID: \"97c11447-7070-4233-aaf2-7661d687049d\") " Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.940415 4626 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "97c11447-7070-4233-aaf2-7661d687049d" (UID: "97c11447-7070-4233-aaf2-7661d687049d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.941526 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c11447-7070-4233-aaf2-7661d687049d-kube-api-access-7mjqh" (OuterVolumeSpecName: "kube-api-access-7mjqh") pod "97c11447-7070-4233-aaf2-7661d687049d" (UID: "97c11447-7070-4233-aaf2-7661d687049d"). InnerVolumeSpecName "kube-api-access-7mjqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.950932 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c11447-7070-4233-aaf2-7661d687049d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "97c11447-7070-4233-aaf2-7661d687049d" (UID: "97c11447-7070-4233-aaf2-7661d687049d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.953617 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "97c11447-7070-4233-aaf2-7661d687049d" (UID: "97c11447-7070-4233-aaf2-7661d687049d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:14:23 crc kubenswrapper[4626]: I0223 07:14:23.954481 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-inventory" (OuterVolumeSpecName: "inventory") pod "97c11447-7070-4233-aaf2-7661d687049d" (UID: "97c11447-7070-4233-aaf2-7661d687049d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.031493 4626 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.031915 4626 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/97c11447-7070-4233-aaf2-7661d687049d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.031943 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.031955 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mjqh\" (UniqueName: \"kubernetes.io/projected/97c11447-7070-4233-aaf2-7661d687049d-kube-api-access-7mjqh\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.031966 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97c11447-7070-4233-aaf2-7661d687049d-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.455926 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" event={"ID":"97c11447-7070-4233-aaf2-7661d687049d","Type":"ContainerDied","Data":"a5e2d832f86f77cfe4381980d7a0282eda0e9a053028b9f63583a7df04dd72f4"} Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.455983 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5e2d832f86f77cfe4381980d7a0282eda0e9a053028b9f63583a7df04dd72f4" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.456016 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-szt5h" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.563896 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb"] Feb 23 07:14:24 crc kubenswrapper[4626]: E0223 07:14:24.564424 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="registry-server" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.564448 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="registry-server" Feb 23 07:14:24 crc kubenswrapper[4626]: E0223 07:14:24.564457 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c11447-7070-4233-aaf2-7661d687049d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.564464 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c11447-7070-4233-aaf2-7661d687049d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 23 07:14:24 crc kubenswrapper[4626]: E0223 07:14:24.564474 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="extract-utilities" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.564481 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="extract-utilities" Feb 23 07:14:24 crc kubenswrapper[4626]: E0223 07:14:24.564515 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="extract-content" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.564520 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="extract-content" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.564754 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="20da9bfa-49ea-4bd3-88e0-e6ec06e0ece0" containerName="registry-server" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.564784 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c11447-7070-4233-aaf2-7661d687049d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.565598 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.567990 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.568368 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.568576 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.568723 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.568728 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.576233 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb"] Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.577047 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.745595 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.745961 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.746091 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8swxh\" (UniqueName: \"kubernetes.io/projected/671613e8-e8c1-40e7-86bf-026acd3864fe-kube-api-access-8swxh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.746190 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.746393 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.746535 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.849545 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.849755 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.849793 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8swxh\" (UniqueName: \"kubernetes.io/projected/671613e8-e8c1-40e7-86bf-026acd3864fe-kube-api-access-8swxh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.849846 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.849956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.850038 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.863172 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.863240 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: 
\"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.863308 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.863364 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.864024 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.866993 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8swxh\" (UniqueName: \"kubernetes.io/projected/671613e8-e8c1-40e7-86bf-026acd3864fe-kube-api-access-8swxh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:24 crc kubenswrapper[4626]: I0223 07:14:24.888472 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.519280 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb"] Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.738445 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jh7mt"] Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.740824 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.754082 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jh7mt"] Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.780301 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9644g\" (UniqueName: \"kubernetes.io/projected/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-kube-api-access-9644g\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.780362 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-utilities\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.780397 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-catalog-content\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.883005 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9644g\" (UniqueName: \"kubernetes.io/projected/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-kube-api-access-9644g\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.883488 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-utilities\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.883533 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-catalog-content\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.884104 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-catalog-content\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.884230 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-utilities\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:25 crc kubenswrapper[4626]: I0223 07:14:25.903667 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9644g\" (UniqueName: \"kubernetes.io/projected/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-kube-api-access-9644g\") pod \"redhat-operators-jh7mt\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:26 crc kubenswrapper[4626]: I0223 07:14:26.063759 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:26 crc kubenswrapper[4626]: I0223 07:14:26.483358 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" event={"ID":"671613e8-e8c1-40e7-86bf-026acd3864fe","Type":"ContainerStarted","Data":"ea11f8d2a78c736cdef1285be7f546b63a0a56de8a4660b14603f9af2ae055bd"} Feb 23 07:14:26 crc kubenswrapper[4626]: I0223 07:14:26.483778 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" event={"ID":"671613e8-e8c1-40e7-86bf-026acd3864fe","Type":"ContainerStarted","Data":"eff9e26ba0c1476f73a73703c70df5870de6a3b1acb401972bd595989ed1e58c"} Feb 23 07:14:26 crc kubenswrapper[4626]: I0223 07:14:26.515679 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" podStartSLOduration=1.970334724 podStartE2EDuration="2.515659616s" podCreationTimestamp="2026-02-23 07:14:24 +0000 UTC" firstStartedPulling="2026-02-23 07:14:25.514848184 +0000 UTC m=+2017.854177450" lastFinishedPulling="2026-02-23 07:14:26.060173076 +0000 UTC m=+2018.399502342" 
observedRunningTime="2026-02-23 07:14:26.501608061 +0000 UTC m=+2018.840937328" watchObservedRunningTime="2026-02-23 07:14:26.515659616 +0000 UTC m=+2018.854988883" Feb 23 07:14:26 crc kubenswrapper[4626]: I0223 07:14:26.564361 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jh7mt"] Feb 23 07:14:27 crc kubenswrapper[4626]: I0223 07:14:27.501549 4626 generic.go:334] "Generic (PLEG): container finished" podID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerID="d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028" exitCode=0 Feb 23 07:14:27 crc kubenswrapper[4626]: I0223 07:14:27.501649 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh7mt" event={"ID":"88043bc2-ae35-43fc-93cf-78b5e7c21e9b","Type":"ContainerDied","Data":"d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028"} Feb 23 07:14:27 crc kubenswrapper[4626]: I0223 07:14:27.502533 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh7mt" event={"ID":"88043bc2-ae35-43fc-93cf-78b5e7c21e9b","Type":"ContainerStarted","Data":"6c8adae1783cf394b886d5c9f4e865087a12ff90cb76b57e49e8a5e6de89a940"} Feb 23 07:14:28 crc kubenswrapper[4626]: I0223 07:14:28.518411 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh7mt" event={"ID":"88043bc2-ae35-43fc-93cf-78b5e7c21e9b","Type":"ContainerStarted","Data":"c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2"} Feb 23 07:14:31 crc kubenswrapper[4626]: I0223 07:14:31.546240 4626 generic.go:334] "Generic (PLEG): container finished" podID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerID="c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2" exitCode=0 Feb 23 07:14:31 crc kubenswrapper[4626]: I0223 07:14:31.546342 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh7mt" 
event={"ID":"88043bc2-ae35-43fc-93cf-78b5e7c21e9b","Type":"ContainerDied","Data":"c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2"} Feb 23 07:14:32 crc kubenswrapper[4626]: I0223 07:14:32.571892 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh7mt" event={"ID":"88043bc2-ae35-43fc-93cf-78b5e7c21e9b","Type":"ContainerStarted","Data":"0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299"} Feb 23 07:14:32 crc kubenswrapper[4626]: I0223 07:14:32.595565 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jh7mt" podStartSLOduration=3.096608261 podStartE2EDuration="7.595547691s" podCreationTimestamp="2026-02-23 07:14:25 +0000 UTC" firstStartedPulling="2026-02-23 07:14:27.5054137 +0000 UTC m=+2019.844742966" lastFinishedPulling="2026-02-23 07:14:32.00435313 +0000 UTC m=+2024.343682396" observedRunningTime="2026-02-23 07:14:32.594456854 +0000 UTC m=+2024.933786120" watchObservedRunningTime="2026-02-23 07:14:32.595547691 +0000 UTC m=+2024.934876956" Feb 23 07:14:36 crc kubenswrapper[4626]: I0223 07:14:36.064644 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:36 crc kubenswrapper[4626]: I0223 07:14:36.066017 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:37 crc kubenswrapper[4626]: I0223 07:14:37.102969 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jh7mt" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="registry-server" probeResult="failure" output=< Feb 23 07:14:37 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:14:37 crc kubenswrapper[4626]: > Feb 23 07:14:46 crc kubenswrapper[4626]: I0223 07:14:46.106714 4626 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:46 crc kubenswrapper[4626]: I0223 07:14:46.151757 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:46 crc kubenswrapper[4626]: I0223 07:14:46.725382 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jh7mt"] Feb 23 07:14:47 crc kubenswrapper[4626]: I0223 07:14:47.708322 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jh7mt" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="registry-server" containerID="cri-o://0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299" gracePeriod=2 Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.195068 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.368904 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9644g\" (UniqueName: \"kubernetes.io/projected/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-kube-api-access-9644g\") pod \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.369069 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-catalog-content\") pod \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.369121 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-utilities\") pod 
\"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\" (UID: \"88043bc2-ae35-43fc-93cf-78b5e7c21e9b\") " Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.369770 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-utilities" (OuterVolumeSpecName: "utilities") pod "88043bc2-ae35-43fc-93cf-78b5e7c21e9b" (UID: "88043bc2-ae35-43fc-93cf-78b5e7c21e9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.374618 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-kube-api-access-9644g" (OuterVolumeSpecName: "kube-api-access-9644g") pod "88043bc2-ae35-43fc-93cf-78b5e7c21e9b" (UID: "88043bc2-ae35-43fc-93cf-78b5e7c21e9b"). InnerVolumeSpecName "kube-api-access-9644g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.471436 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.471469 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9644g\" (UniqueName: \"kubernetes.io/projected/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-kube-api-access-9644g\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.478623 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88043bc2-ae35-43fc-93cf-78b5e7c21e9b" (UID: "88043bc2-ae35-43fc-93cf-78b5e7c21e9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.574468 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88043bc2-ae35-43fc-93cf-78b5e7c21e9b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.728205 4626 generic.go:334] "Generic (PLEG): container finished" podID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerID="0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299" exitCode=0 Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.728268 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh7mt" event={"ID":"88043bc2-ae35-43fc-93cf-78b5e7c21e9b","Type":"ContainerDied","Data":"0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299"} Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.728292 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jh7mt" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.728315 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh7mt" event={"ID":"88043bc2-ae35-43fc-93cf-78b5e7c21e9b","Type":"ContainerDied","Data":"6c8adae1783cf394b886d5c9f4e865087a12ff90cb76b57e49e8a5e6de89a940"} Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.728338 4626 scope.go:117] "RemoveContainer" containerID="0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.751161 4626 scope.go:117] "RemoveContainer" containerID="c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.768374 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jh7mt"] Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.781745 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jh7mt"] Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.788941 4626 scope.go:117] "RemoveContainer" containerID="d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.842869 4626 scope.go:117] "RemoveContainer" containerID="0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299" Feb 23 07:14:48 crc kubenswrapper[4626]: E0223 07:14:48.843552 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299\": container with ID starting with 0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299 not found: ID does not exist" containerID="0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.843588 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299"} err="failed to get container status \"0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299\": rpc error: code = NotFound desc = could not find container \"0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299\": container with ID starting with 0f7064f18f80617bd5b2f5fa415c9c2c9970bc071e5ed685685179a8c6410299 not found: ID does not exist" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.843627 4626 scope.go:117] "RemoveContainer" containerID="c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2" Feb 23 07:14:48 crc kubenswrapper[4626]: E0223 07:14:48.843990 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2\": container with ID starting with c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2 not found: ID does not exist" containerID="c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.844013 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2"} err="failed to get container status \"c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2\": rpc error: code = NotFound desc = could not find container \"c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2\": container with ID starting with c9c368fd97932ecbc8ed86e9eff20a67fdf4ac83a5fa3a0de7cae4768abb65c2 not found: ID does not exist" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.844030 4626 scope.go:117] "RemoveContainer" containerID="d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028" Feb 23 07:14:48 crc kubenswrapper[4626]: E0223 
07:14:48.844635 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028\": container with ID starting with d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028 not found: ID does not exist" containerID="d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028" Feb 23 07:14:48 crc kubenswrapper[4626]: I0223 07:14:48.844663 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028"} err="failed to get container status \"d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028\": rpc error: code = NotFound desc = could not find container \"d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028\": container with ID starting with d5fc533cbee2ef00b778b19d338fc8b8e03853ddc37027b2caf2b4219c03c028 not found: ID does not exist" Feb 23 07:14:49 crc kubenswrapper[4626]: I0223 07:14:49.992351 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" path="/var/lib/kubelet/pods/88043bc2-ae35-43fc-93cf-78b5e7c21e9b/volumes" Feb 23 07:14:55 crc kubenswrapper[4626]: I0223 07:14:55.685826 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:14:55 crc kubenswrapper[4626]: I0223 07:14:55.687384 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.146646 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2"] Feb 23 07:15:00 crc kubenswrapper[4626]: E0223 07:15:00.147586 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="extract-utilities" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.147600 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="extract-utilities" Feb 23 07:15:00 crc kubenswrapper[4626]: E0223 07:15:00.147629 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="registry-server" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.147636 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="registry-server" Feb 23 07:15:00 crc kubenswrapper[4626]: E0223 07:15:00.147663 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="extract-content" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.147670 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="extract-content" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.147871 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="88043bc2-ae35-43fc-93cf-78b5e7c21e9b" containerName="registry-server" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.148574 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.150570 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.151127 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.155381 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2"] Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.336339 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9brq2\" (UniqueName: \"kubernetes.io/projected/13c6d748-b917-4113-b932-b846717ebed2-kube-api-access-9brq2\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.336562 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c6d748-b917-4113-b932-b846717ebed2-secret-volume\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.336624 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c6d748-b917-4113-b932-b846717ebed2-config-volume\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.438208 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c6d748-b917-4113-b932-b846717ebed2-secret-volume\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.438256 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c6d748-b917-4113-b932-b846717ebed2-config-volume\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.438366 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9brq2\" (UniqueName: \"kubernetes.io/projected/13c6d748-b917-4113-b932-b846717ebed2-kube-api-access-9brq2\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.439247 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c6d748-b917-4113-b932-b846717ebed2-config-volume\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.446200 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/13c6d748-b917-4113-b932-b846717ebed2-secret-volume\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.458266 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9brq2\" (UniqueName: \"kubernetes.io/projected/13c6d748-b917-4113-b932-b846717ebed2-kube-api-access-9brq2\") pod \"collect-profiles-29530515-27ph2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.471967 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:00 crc kubenswrapper[4626]: I0223 07:15:00.891983 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2"] Feb 23 07:15:01 crc kubenswrapper[4626]: I0223 07:15:01.855448 4626 generic.go:334] "Generic (PLEG): container finished" podID="13c6d748-b917-4113-b932-b846717ebed2" containerID="f3093b7741acda3a958bef1ba5e379cef5a42a8b7232ca0827fe2c1dbe15d8d9" exitCode=0 Feb 23 07:15:01 crc kubenswrapper[4626]: I0223 07:15:01.855547 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" event={"ID":"13c6d748-b917-4113-b932-b846717ebed2","Type":"ContainerDied","Data":"f3093b7741acda3a958bef1ba5e379cef5a42a8b7232ca0827fe2c1dbe15d8d9"} Feb 23 07:15:01 crc kubenswrapper[4626]: I0223 07:15:01.856159 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" 
event={"ID":"13c6d748-b917-4113-b932-b846717ebed2","Type":"ContainerStarted","Data":"e3fb627fa4cbbe50545d9a7fa28c53bcd41b9419f3253b446a5f5e0c98beeea5"} Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.160172 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.218123 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c6d748-b917-4113-b932-b846717ebed2-config-volume\") pod \"13c6d748-b917-4113-b932-b846717ebed2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.218188 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9brq2\" (UniqueName: \"kubernetes.io/projected/13c6d748-b917-4113-b932-b846717ebed2-kube-api-access-9brq2\") pod \"13c6d748-b917-4113-b932-b846717ebed2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.218565 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c6d748-b917-4113-b932-b846717ebed2-secret-volume\") pod \"13c6d748-b917-4113-b932-b846717ebed2\" (UID: \"13c6d748-b917-4113-b932-b846717ebed2\") " Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.219918 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c6d748-b917-4113-b932-b846717ebed2-config-volume" (OuterVolumeSpecName: "config-volume") pod "13c6d748-b917-4113-b932-b846717ebed2" (UID: "13c6d748-b917-4113-b932-b846717ebed2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.238093 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c6d748-b917-4113-b932-b846717ebed2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "13c6d748-b917-4113-b932-b846717ebed2" (UID: "13c6d748-b917-4113-b932-b846717ebed2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.238354 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c6d748-b917-4113-b932-b846717ebed2-kube-api-access-9brq2" (OuterVolumeSpecName: "kube-api-access-9brq2") pod "13c6d748-b917-4113-b932-b846717ebed2" (UID: "13c6d748-b917-4113-b932-b846717ebed2"). InnerVolumeSpecName "kube-api-access-9brq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.322055 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c6d748-b917-4113-b932-b846717ebed2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.322110 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c6d748-b917-4113-b932-b846717ebed2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.322125 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9brq2\" (UniqueName: \"kubernetes.io/projected/13c6d748-b917-4113-b932-b846717ebed2-kube-api-access-9brq2\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.874258 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" 
event={"ID":"13c6d748-b917-4113-b932-b846717ebed2","Type":"ContainerDied","Data":"e3fb627fa4cbbe50545d9a7fa28c53bcd41b9419f3253b446a5f5e0c98beeea5"} Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.874650 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3fb627fa4cbbe50545d9a7fa28c53bcd41b9419f3253b446a5f5e0c98beeea5" Feb 23 07:15:03 crc kubenswrapper[4626]: I0223 07:15:03.874347 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2" Feb 23 07:15:04 crc kubenswrapper[4626]: I0223 07:15:04.242880 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"] Feb 23 07:15:04 crc kubenswrapper[4626]: I0223 07:15:04.249002 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530470-ksbmj"] Feb 23 07:15:06 crc kubenswrapper[4626]: I0223 07:15:06.027071 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de9fe51-8926-4966-85f1-b14c16db8a74" path="/var/lib/kubelet/pods/7de9fe51-8926-4966-85f1-b14c16db8a74/volumes" Feb 23 07:15:06 crc kubenswrapper[4626]: I0223 07:15:06.902387 4626 generic.go:334] "Generic (PLEG): container finished" podID="671613e8-e8c1-40e7-86bf-026acd3864fe" containerID="ea11f8d2a78c736cdef1285be7f546b63a0a56de8a4660b14603f9af2ae055bd" exitCode=0 Feb 23 07:15:06 crc kubenswrapper[4626]: I0223 07:15:06.902474 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" event={"ID":"671613e8-e8c1-40e7-86bf-026acd3864fe","Type":"ContainerDied","Data":"ea11f8d2a78c736cdef1285be7f546b63a0a56de8a4660b14603f9af2ae055bd"} Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.291654 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.323711 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-ssh-key-openstack-edpm-ipam\") pod \"671613e8-e8c1-40e7-86bf-026acd3864fe\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.323999 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8swxh\" (UniqueName: \"kubernetes.io/projected/671613e8-e8c1-40e7-86bf-026acd3864fe-kube-api-access-8swxh\") pod \"671613e8-e8c1-40e7-86bf-026acd3864fe\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.324110 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-inventory\") pod \"671613e8-e8c1-40e7-86bf-026acd3864fe\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.324209 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"671613e8-e8c1-40e7-86bf-026acd3864fe\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.324234 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-metadata-combined-ca-bundle\") pod \"671613e8-e8c1-40e7-86bf-026acd3864fe\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " Feb 23 
07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.324460 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-nova-metadata-neutron-config-0\") pod \"671613e8-e8c1-40e7-86bf-026acd3864fe\" (UID: \"671613e8-e8c1-40e7-86bf-026acd3864fe\") " Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.331979 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "671613e8-e8c1-40e7-86bf-026acd3864fe" (UID: "671613e8-e8c1-40e7-86bf-026acd3864fe"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.333095 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671613e8-e8c1-40e7-86bf-026acd3864fe-kube-api-access-8swxh" (OuterVolumeSpecName: "kube-api-access-8swxh") pod "671613e8-e8c1-40e7-86bf-026acd3864fe" (UID: "671613e8-e8c1-40e7-86bf-026acd3864fe"). InnerVolumeSpecName "kube-api-access-8swxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.351929 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "671613e8-e8c1-40e7-86bf-026acd3864fe" (UID: "671613e8-e8c1-40e7-86bf-026acd3864fe"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.354557 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-inventory" (OuterVolumeSpecName: "inventory") pod "671613e8-e8c1-40e7-86bf-026acd3864fe" (UID: "671613e8-e8c1-40e7-86bf-026acd3864fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.359080 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "671613e8-e8c1-40e7-86bf-026acd3864fe" (UID: "671613e8-e8c1-40e7-86bf-026acd3864fe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.365388 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "671613e8-e8c1-40e7-86bf-026acd3864fe" (UID: "671613e8-e8c1-40e7-86bf-026acd3864fe"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.428114 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8swxh\" (UniqueName: \"kubernetes.io/projected/671613e8-e8c1-40e7-86bf-026acd3864fe-kube-api-access-8swxh\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.428517 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.428594 4626 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.428662 4626 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.428724 4626 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.428955 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/671613e8-e8c1-40e7-86bf-026acd3864fe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.924400 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" event={"ID":"671613e8-e8c1-40e7-86bf-026acd3864fe","Type":"ContainerDied","Data":"eff9e26ba0c1476f73a73703c70df5870de6a3b1acb401972bd595989ed1e58c"} Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.924476 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff9e26ba0c1476f73a73703c70df5870de6a3b1acb401972bd595989ed1e58c" Feb 23 07:15:08 crc kubenswrapper[4626]: I0223 07:15:08.924484 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.017849 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs"] Feb 23 07:15:09 crc kubenswrapper[4626]: E0223 07:15:09.018734 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671613e8-e8c1-40e7-86bf-026acd3864fe" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.018762 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="671613e8-e8c1-40e7-86bf-026acd3864fe" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 23 07:15:09 crc kubenswrapper[4626]: E0223 07:15:09.018782 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c6d748-b917-4113-b932-b846717ebed2" containerName="collect-profiles" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.018791 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c6d748-b917-4113-b932-b846717ebed2" containerName="collect-profiles" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.019130 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="671613e8-e8c1-40e7-86bf-026acd3864fe" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 23 07:15:09 crc kubenswrapper[4626]: 
I0223 07:15:09.019149 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c6d748-b917-4113-b932-b846717ebed2" containerName="collect-profiles" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.020206 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.022319 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.022819 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.022894 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.023070 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.023582 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.041594 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs"] Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.145213 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.145354 4626 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gj6x\" (UniqueName: \"kubernetes.io/projected/a9f59db3-8e35-432d-9dc1-bf70b5de9990-kube-api-access-5gj6x\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.145626 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.145737 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.146082 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.247672 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.247826 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.247862 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gj6x\" (UniqueName: \"kubernetes.io/projected/a9f59db3-8e35-432d-9dc1-bf70b5de9990-kube-api-access-5gj6x\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.248693 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.248854 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 
07:15:09.253680 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.255370 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.257826 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.259771 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.264012 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gj6x\" (UniqueName: \"kubernetes.io/projected/a9f59db3-8e35-432d-9dc1-bf70b5de9990-kube-api-access-5gj6x\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs\" 
(UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.363165 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.859396 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs"] Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.867976 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:15:09 crc kubenswrapper[4626]: I0223 07:15:09.938138 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" event={"ID":"a9f59db3-8e35-432d-9dc1-bf70b5de9990","Type":"ContainerStarted","Data":"8562ee403bb1f4c5456722a2a40e36b384d25d4584c527e5ba4c61037d49f737"} Feb 23 07:15:10 crc kubenswrapper[4626]: I0223 07:15:10.950780 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" event={"ID":"a9f59db3-8e35-432d-9dc1-bf70b5de9990","Type":"ContainerStarted","Data":"701d365d6bf6521ed25505b41878a543f2f4cae32196ef4c243250a6d5fff7db"} Feb 23 07:15:10 crc kubenswrapper[4626]: I0223 07:15:10.963661 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" podStartSLOduration=2.399311542 podStartE2EDuration="2.963645676s" podCreationTimestamp="2026-02-23 07:15:08 +0000 UTC" firstStartedPulling="2026-02-23 07:15:09.867655652 +0000 UTC m=+2062.206984908" lastFinishedPulling="2026-02-23 07:15:10.431989776 +0000 UTC m=+2062.771319042" observedRunningTime="2026-02-23 07:15:10.961851713 +0000 UTC m=+2063.301180980" watchObservedRunningTime="2026-02-23 07:15:10.963645676 +0000 UTC m=+2063.302974942" Feb 23 
07:15:20 crc kubenswrapper[4626]: I0223 07:15:20.979684 4626 scope.go:117] "RemoveContainer" containerID="86a087b59bd61ad4432a7d8990523be0a09a3135643c8e4a561d95d22cafe9ab" Feb 23 07:15:25 crc kubenswrapper[4626]: I0223 07:15:25.685373 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:15:25 crc kubenswrapper[4626]: I0223 07:15:25.686159 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:15:55 crc kubenswrapper[4626]: I0223 07:15:55.685292 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:15:55 crc kubenswrapper[4626]: I0223 07:15:55.686078 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:15:55 crc kubenswrapper[4626]: I0223 07:15:55.686152 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:15:55 crc kubenswrapper[4626]: I0223 07:15:55.687562 4626 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"522c533173b6dc8610119c7e6504c043a2bd72039bef9f6e0109c726475aba01"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:15:55 crc kubenswrapper[4626]: I0223 07:15:55.687627 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://522c533173b6dc8610119c7e6504c043a2bd72039bef9f6e0109c726475aba01" gracePeriod=600 Feb 23 07:15:56 crc kubenswrapper[4626]: I0223 07:15:56.409268 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="522c533173b6dc8610119c7e6504c043a2bd72039bef9f6e0109c726475aba01" exitCode=0 Feb 23 07:15:56 crc kubenswrapper[4626]: I0223 07:15:56.409343 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"522c533173b6dc8610119c7e6504c043a2bd72039bef9f6e0109c726475aba01"} Feb 23 07:15:56 crc kubenswrapper[4626]: I0223 07:15:56.409635 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10"} Feb 23 07:15:56 crc kubenswrapper[4626]: I0223 07:15:56.409673 4626 scope.go:117] "RemoveContainer" containerID="f30f72a645d2155a5ae47d4a54854c000cd9d5a2401119b45e6f014c89c6f5ac" Feb 23 07:17:09 crc kubenswrapper[4626]: I0223 07:17:09.863883 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8bgms"] Feb 23 07:17:09 crc 
kubenswrapper[4626]: I0223 07:17:09.867291 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:09 crc kubenswrapper[4626]: I0223 07:17:09.882799 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bgms"] Feb 23 07:17:09 crc kubenswrapper[4626]: I0223 07:17:09.955105 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-catalog-content\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:09 crc kubenswrapper[4626]: I0223 07:17:09.955202 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-utilities\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:09 crc kubenswrapper[4626]: I0223 07:17:09.955306 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8ph\" (UniqueName: \"kubernetes.io/projected/c1736a88-e788-4cbe-a7fd-f1e80331446f-kube-api-access-6q8ph\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.057267 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-catalog-content\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 
crc kubenswrapper[4626]: I0223 07:17:10.057325 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-utilities\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.057361 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q8ph\" (UniqueName: \"kubernetes.io/projected/c1736a88-e788-4cbe-a7fd-f1e80331446f-kube-api-access-6q8ph\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.057802 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-catalog-content\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.057974 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-utilities\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.061007 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gljjt"] Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.062728 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.080809 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q8ph\" (UniqueName: \"kubernetes.io/projected/c1736a88-e788-4cbe-a7fd-f1e80331446f-kube-api-access-6q8ph\") pod \"redhat-marketplace-8bgms\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.094507 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gljjt"] Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.159292 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-utilities\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.159701 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5vv8\" (UniqueName: \"kubernetes.io/projected/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-kube-api-access-v5vv8\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.159850 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-catalog-content\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.193327 4626 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.261734 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-utilities\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.261894 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5vv8\" (UniqueName: \"kubernetes.io/projected/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-kube-api-access-v5vv8\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.261922 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-catalog-content\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.262323 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-catalog-content\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.262477 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-utilities\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " 
pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.294694 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5vv8\" (UniqueName: \"kubernetes.io/projected/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-kube-api-access-v5vv8\") pod \"community-operators-gljjt\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") " pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.378786 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.714555 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bgms"] Feb 23 07:17:10 crc kubenswrapper[4626]: I0223 07:17:10.731542 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gljjt"] Feb 23 07:17:11 crc kubenswrapper[4626]: I0223 07:17:11.153591 4626 generic.go:334] "Generic (PLEG): container finished" podID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerID="2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd" exitCode=0 Feb 23 07:17:11 crc kubenswrapper[4626]: I0223 07:17:11.154163 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bgms" event={"ID":"c1736a88-e788-4cbe-a7fd-f1e80331446f","Type":"ContainerDied","Data":"2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd"} Feb 23 07:17:11 crc kubenswrapper[4626]: I0223 07:17:11.154204 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bgms" event={"ID":"c1736a88-e788-4cbe-a7fd-f1e80331446f","Type":"ContainerStarted","Data":"f7e20bf28e11b76d1e9f11ca91baa422168c03636ac807d81a46c7c1c467d67d"} Feb 23 07:17:11 crc kubenswrapper[4626]: I0223 07:17:11.161243 4626 generic.go:334] "Generic (PLEG): 
container finished" podID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerID="aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777" exitCode=0 Feb 23 07:17:11 crc kubenswrapper[4626]: I0223 07:17:11.161320 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gljjt" event={"ID":"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6","Type":"ContainerDied","Data":"aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777"} Feb 23 07:17:11 crc kubenswrapper[4626]: I0223 07:17:11.161382 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gljjt" event={"ID":"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6","Type":"ContainerStarted","Data":"338aa257e3a27569cdf12d3bea001e1c25ca08363e48aaa4334b549342fe39cb"} Feb 23 07:17:12 crc kubenswrapper[4626]: I0223 07:17:12.174771 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bgms" event={"ID":"c1736a88-e788-4cbe-a7fd-f1e80331446f","Type":"ContainerStarted","Data":"ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b"} Feb 23 07:17:12 crc kubenswrapper[4626]: I0223 07:17:12.177572 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gljjt" event={"ID":"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6","Type":"ContainerStarted","Data":"7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf"} Feb 23 07:17:14 crc kubenswrapper[4626]: I0223 07:17:14.210117 4626 generic.go:334] "Generic (PLEG): container finished" podID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerID="7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf" exitCode=0 Feb 23 07:17:14 crc kubenswrapper[4626]: I0223 07:17:14.210198 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gljjt" 
event={"ID":"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6","Type":"ContainerDied","Data":"7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf"} Feb 23 07:17:14 crc kubenswrapper[4626]: I0223 07:17:14.214252 4626 generic.go:334] "Generic (PLEG): container finished" podID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerID="ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b" exitCode=0 Feb 23 07:17:14 crc kubenswrapper[4626]: I0223 07:17:14.214324 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bgms" event={"ID":"c1736a88-e788-4cbe-a7fd-f1e80331446f","Type":"ContainerDied","Data":"ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b"} Feb 23 07:17:15 crc kubenswrapper[4626]: I0223 07:17:15.247894 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bgms" event={"ID":"c1736a88-e788-4cbe-a7fd-f1e80331446f","Type":"ContainerStarted","Data":"43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b"} Feb 23 07:17:15 crc kubenswrapper[4626]: I0223 07:17:15.252786 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gljjt" event={"ID":"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6","Type":"ContainerStarted","Data":"fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80"} Feb 23 07:17:15 crc kubenswrapper[4626]: I0223 07:17:15.275277 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8bgms" podStartSLOduration=2.718972313 podStartE2EDuration="6.275255418s" podCreationTimestamp="2026-02-23 07:17:09 +0000 UTC" firstStartedPulling="2026-02-23 07:17:11.157411307 +0000 UTC m=+2183.496740573" lastFinishedPulling="2026-02-23 07:17:14.713694413 +0000 UTC m=+2187.053023678" observedRunningTime="2026-02-23 07:17:15.268751978 +0000 UTC m=+2187.608081244" watchObservedRunningTime="2026-02-23 07:17:15.275255418 +0000 UTC 
m=+2187.614584685" Feb 23 07:17:15 crc kubenswrapper[4626]: I0223 07:17:15.295847 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gljjt" podStartSLOduration=1.744365677 podStartE2EDuration="5.295834015s" podCreationTimestamp="2026-02-23 07:17:10 +0000 UTC" firstStartedPulling="2026-02-23 07:17:11.163347488 +0000 UTC m=+2183.502676755" lastFinishedPulling="2026-02-23 07:17:14.714815836 +0000 UTC m=+2187.054145093" observedRunningTime="2026-02-23 07:17:15.294890146 +0000 UTC m=+2187.634219412" watchObservedRunningTime="2026-02-23 07:17:15.295834015 +0000 UTC m=+2187.635163282" Feb 23 07:17:20 crc kubenswrapper[4626]: I0223 07:17:20.193537 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:20 crc kubenswrapper[4626]: I0223 07:17:20.194216 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:20 crc kubenswrapper[4626]: I0223 07:17:20.232627 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:20 crc kubenswrapper[4626]: I0223 07:17:20.342918 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:20 crc kubenswrapper[4626]: I0223 07:17:20.380135 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:20 crc kubenswrapper[4626]: I0223 07:17:20.380417 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:20 crc kubenswrapper[4626]: I0223 07:17:20.421766 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:20 
crc kubenswrapper[4626]: I0223 07:17:20.851265 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bgms"] Feb 23 07:17:21 crc kubenswrapper[4626]: I0223 07:17:21.351388 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gljjt" Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.326111 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8bgms" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="registry-server" containerID="cri-o://43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b" gracePeriod=2 Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.652265 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gljjt"] Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.680881 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.755645 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-utilities\") pod \"c1736a88-e788-4cbe-a7fd-f1e80331446f\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.755840 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-catalog-content\") pod \"c1736a88-e788-4cbe-a7fd-f1e80331446f\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.755883 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q8ph\" (UniqueName: \"kubernetes.io/projected/c1736a88-e788-4cbe-a7fd-f1e80331446f-kube-api-access-6q8ph\") pod \"c1736a88-e788-4cbe-a7fd-f1e80331446f\" (UID: \"c1736a88-e788-4cbe-a7fd-f1e80331446f\") " Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.756553 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-utilities" (OuterVolumeSpecName: "utilities") pod "c1736a88-e788-4cbe-a7fd-f1e80331446f" (UID: "c1736a88-e788-4cbe-a7fd-f1e80331446f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.756892 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.766840 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1736a88-e788-4cbe-a7fd-f1e80331446f-kube-api-access-6q8ph" (OuterVolumeSpecName: "kube-api-access-6q8ph") pod "c1736a88-e788-4cbe-a7fd-f1e80331446f" (UID: "c1736a88-e788-4cbe-a7fd-f1e80331446f"). InnerVolumeSpecName "kube-api-access-6q8ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.775969 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1736a88-e788-4cbe-a7fd-f1e80331446f" (UID: "c1736a88-e788-4cbe-a7fd-f1e80331446f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.858986 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1736a88-e788-4cbe-a7fd-f1e80331446f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:17:22 crc kubenswrapper[4626]: I0223 07:17:22.859019 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q8ph\" (UniqueName: \"kubernetes.io/projected/c1736a88-e788-4cbe-a7fd-f1e80331446f-kube-api-access-6q8ph\") on node \"crc\" DevicePath \"\"" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.339290 4626 generic.go:334] "Generic (PLEG): container finished" podID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerID="43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b" exitCode=0 Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.339373 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bgms" event={"ID":"c1736a88-e788-4cbe-a7fd-f1e80331446f","Type":"ContainerDied","Data":"43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b"} Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.339739 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bgms" event={"ID":"c1736a88-e788-4cbe-a7fd-f1e80331446f","Type":"ContainerDied","Data":"f7e20bf28e11b76d1e9f11ca91baa422168c03636ac807d81a46c7c1c467d67d"} Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.339774 4626 scope.go:117] "RemoveContainer" containerID="43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.339407 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bgms" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.340391 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gljjt" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="registry-server" containerID="cri-o://fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80" gracePeriod=2 Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.374330 4626 scope.go:117] "RemoveContainer" containerID="ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.419554 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bgms"] Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.432687 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bgms"] Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.450288 4626 scope.go:117] "RemoveContainer" containerID="2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.548929 4626 scope.go:117] "RemoveContainer" containerID="43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b" Feb 23 07:17:23 crc kubenswrapper[4626]: E0223 07:17:23.549685 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b\": container with ID starting with 43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b not found: ID does not exist" containerID="43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.549727 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b"} err="failed to get container status \"43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b\": rpc error: code = NotFound desc = could not find container \"43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b\": container with ID starting with 43157c2041411ff121c5053d1a64772fba83636e01ffb1cc12a396de885a561b not found: ID does not exist" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.549756 4626 scope.go:117] "RemoveContainer" containerID="ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b" Feb 23 07:17:23 crc kubenswrapper[4626]: E0223 07:17:23.550150 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b\": container with ID starting with ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b not found: ID does not exist" containerID="ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.550174 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b"} err="failed to get container status \"ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b\": rpc error: code = NotFound desc = could not find container \"ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b\": container with ID starting with ccf032aef90700bfa9596f8814b1179e695324fa9431b8bc039db5e40e4e142b not found: ID does not exist" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.550190 4626 scope.go:117] "RemoveContainer" containerID="2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd" Feb 23 07:17:23 crc kubenswrapper[4626]: E0223 07:17:23.551395 4626 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd\": container with ID starting with 2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd not found: ID does not exist" containerID="2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.551432 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd"} err="failed to get container status \"2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd\": rpc error: code = NotFound desc = could not find container \"2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd\": container with ID starting with 2f8788883992a6c2b9db7684a4b3f31b7836a460a2acd4cb8a85b4fa8427bdbd not found: ID does not exist" Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.883819 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gljjt"
Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.984561 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5vv8\" (UniqueName: \"kubernetes.io/projected/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-kube-api-access-v5vv8\") pod \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") "
Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.984842 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-catalog-content\") pod \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") "
Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.984913 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-utilities\") pod \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\" (UID: \"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6\") "
Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.985460 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-utilities" (OuterVolumeSpecName: "utilities") pod "e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" (UID: "e8f0e14d-4e26-4355-9e25-7172e2f5e2b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:17:23 crc kubenswrapper[4626]: I0223 07:17:23.986419 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:23.992387 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-kube-api-access-v5vv8" (OuterVolumeSpecName: "kube-api-access-v5vv8") pod "e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" (UID: "e8f0e14d-4e26-4355-9e25-7172e2f5e2b6"). InnerVolumeSpecName "kube-api-access-v5vv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:23.994230 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" path="/var/lib/kubelet/pods/c1736a88-e788-4cbe-a7fd-f1e80331446f/volumes"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.029647 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" (UID: "e8f0e14d-4e26-4355-9e25-7172e2f5e2b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.088572 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.088597 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5vv8\" (UniqueName: \"kubernetes.io/projected/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6-kube-api-access-v5vv8\") on node \"crc\" DevicePath \"\""
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.353310 4626 generic.go:334] "Generic (PLEG): container finished" podID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerID="fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80" exitCode=0
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.353381 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gljjt"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.353415 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gljjt" event={"ID":"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6","Type":"ContainerDied","Data":"fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80"}
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.353457 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gljjt" event={"ID":"e8f0e14d-4e26-4355-9e25-7172e2f5e2b6","Type":"ContainerDied","Data":"338aa257e3a27569cdf12d3bea001e1c25ca08363e48aaa4334b549342fe39cb"}
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.353478 4626 scope.go:117] "RemoveContainer" containerID="fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.380772 4626 scope.go:117] "RemoveContainer" containerID="7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.388950 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gljjt"]
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.398764 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gljjt"]
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.411347 4626 scope.go:117] "RemoveContainer" containerID="aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.429035 4626 scope.go:117] "RemoveContainer" containerID="fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80"
Feb 23 07:17:24 crc kubenswrapper[4626]: E0223 07:17:24.429607 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80\": container with ID starting with fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80 not found: ID does not exist" containerID="fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.429652 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80"} err="failed to get container status \"fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80\": rpc error: code = NotFound desc = could not find container \"fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80\": container with ID starting with fe3965ec961e172bc4f243a1afb6fa9cbd6c2433e3413a38d23fe8a82da6bc80 not found: ID does not exist"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.429701 4626 scope.go:117] "RemoveContainer" containerID="7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf"
Feb 23 07:17:24 crc kubenswrapper[4626]: E0223 07:17:24.430057 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf\": container with ID starting with 7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf not found: ID does not exist" containerID="7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.430146 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf"} err="failed to get container status \"7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf\": rpc error: code = NotFound desc = could not find container \"7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf\": container with ID starting with 7b6c8cab801f8ac7a6717f7520cd457990d565afadf0a88f06c76e42497a11bf not found: ID does not exist"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.430216 4626 scope.go:117] "RemoveContainer" containerID="aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777"
Feb 23 07:17:24 crc kubenswrapper[4626]: E0223 07:17:24.430727 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777\": container with ID starting with aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777 not found: ID does not exist" containerID="aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777"
Feb 23 07:17:24 crc kubenswrapper[4626]: I0223 07:17:24.430773 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777"} err="failed to get container status \"aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777\": rpc error: code = NotFound desc = could not find container \"aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777\": container with ID starting with aafb6125ed91e78e5aaf19fe79b7bf493082d2ddfe12bc690050ba2110671777 not found: ID does not exist"
Feb 23 07:17:25 crc kubenswrapper[4626]: I0223 07:17:25.995390 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" path="/var/lib/kubelet/pods/e8f0e14d-4e26-4355-9e25-7172e2f5e2b6/volumes"
Feb 23 07:18:08 crc kubenswrapper[4626]: I0223 07:18:08.838719 4626 generic.go:334] "Generic (PLEG): container finished" podID="a9f59db3-8e35-432d-9dc1-bf70b5de9990" containerID="701d365d6bf6521ed25505b41878a543f2f4cae32196ef4c243250a6d5fff7db" exitCode=0
Feb 23 07:18:08 crc kubenswrapper[4626]: I0223 07:18:08.838809 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" event={"ID":"a9f59db3-8e35-432d-9dc1-bf70b5de9990","Type":"ContainerDied","Data":"701d365d6bf6521ed25505b41878a543f2f4cae32196ef4c243250a6d5fff7db"}
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.271814 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.331119 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-secret-0\") pod \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") "
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.331290 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gj6x\" (UniqueName: \"kubernetes.io/projected/a9f59db3-8e35-432d-9dc1-bf70b5de9990-kube-api-access-5gj6x\") pod \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") "
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.331341 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-combined-ca-bundle\") pod \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") "
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.331645 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-ssh-key-openstack-edpm-ipam\") pod \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") "
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.331675 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-inventory\") pod \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\" (UID: \"a9f59db3-8e35-432d-9dc1-bf70b5de9990\") "
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.343812 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f59db3-8e35-432d-9dc1-bf70b5de9990-kube-api-access-5gj6x" (OuterVolumeSpecName: "kube-api-access-5gj6x") pod "a9f59db3-8e35-432d-9dc1-bf70b5de9990" (UID: "a9f59db3-8e35-432d-9dc1-bf70b5de9990"). InnerVolumeSpecName "kube-api-access-5gj6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.347516 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a9f59db3-8e35-432d-9dc1-bf70b5de9990" (UID: "a9f59db3-8e35-432d-9dc1-bf70b5de9990"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.363415 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-inventory" (OuterVolumeSpecName: "inventory") pod "a9f59db3-8e35-432d-9dc1-bf70b5de9990" (UID: "a9f59db3-8e35-432d-9dc1-bf70b5de9990"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.365935 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a9f59db3-8e35-432d-9dc1-bf70b5de9990" (UID: "a9f59db3-8e35-432d-9dc1-bf70b5de9990"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.383915 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a9f59db3-8e35-432d-9dc1-bf70b5de9990" (UID: "a9f59db3-8e35-432d-9dc1-bf70b5de9990"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.435537 4626 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.435579 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.435594 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.435604 4626 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a9f59db3-8e35-432d-9dc1-bf70b5de9990-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.435615 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gj6x\" (UniqueName: \"kubernetes.io/projected/a9f59db3-8e35-432d-9dc1-bf70b5de9990-kube-api-access-5gj6x\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.860654 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs" event={"ID":"a9f59db3-8e35-432d-9dc1-bf70b5de9990","Type":"ContainerDied","Data":"8562ee403bb1f4c5456722a2a40e36b384d25d4584c527e5ba4c61037d49f737"}
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.860961 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8562ee403bb1f4c5456722a2a40e36b384d25d4584c527e5ba4c61037d49f737"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.860749 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.944704 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"]
Feb 23 07:18:10 crc kubenswrapper[4626]: E0223 07:18:10.945173 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="extract-content"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945205 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="extract-content"
Feb 23 07:18:10 crc kubenswrapper[4626]: E0223 07:18:10.945230 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="extract-utilities"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945237 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="extract-utilities"
Feb 23 07:18:10 crc kubenswrapper[4626]: E0223 07:18:10.945244 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="extract-utilities"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945251 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="extract-utilities"
Feb 23 07:18:10 crc kubenswrapper[4626]: E0223 07:18:10.945262 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="registry-server"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945267 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="registry-server"
Feb 23 07:18:10 crc kubenswrapper[4626]: E0223 07:18:10.945280 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f59db3-8e35-432d-9dc1-bf70b5de9990" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945286 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f59db3-8e35-432d-9dc1-bf70b5de9990" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:18:10 crc kubenswrapper[4626]: E0223 07:18:10.945308 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="extract-content"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945314 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="extract-content"
Feb 23 07:18:10 crc kubenswrapper[4626]: E0223 07:18:10.945328 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="registry-server"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945333 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="registry-server"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945559 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f59db3-8e35-432d-9dc1-bf70b5de9990" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945585 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f0e14d-4e26-4355-9e25-7172e2f5e2b6" containerName="registry-server"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.945605 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1736a88-e788-4cbe-a7fd-f1e80331446f" containerName="registry-server"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.946307 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.955098 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.955265 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.955387 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.955511 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.955633 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.955744 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6"
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.968215 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"]
Feb 23 07:18:10 crc kubenswrapper[4626]: I0223 07:18:10.974982 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.053148 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.053892 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054056 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2khxp\" (UniqueName: \"kubernetes.io/projected/ba805f29-0d45-499e-bc08-00188c51379f-kube-api-access-2khxp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054284 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054452 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054621 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054655 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054700 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ba805f29-0d45-499e-bc08-00188c51379f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054729 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.054753 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.055021 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.157617 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.157685 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.157740 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ba805f29-0d45-499e-bc08-00188c51379f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.157769 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.157799 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.157875 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.157972 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.158058 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.158089 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2khxp\" (UniqueName: \"kubernetes.io/projected/ba805f29-0d45-499e-bc08-00188c51379f-kube-api-access-2khxp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.158224 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.158369 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.159348 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ba805f29-0d45-499e-bc08-00188c51379f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.165195 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.165693 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.165910 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.166708 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.168919 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.170871 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.170959 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.171965 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.176158 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.178237 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2khxp\" (UniqueName: \"kubernetes.io/projected/ba805f29-0d45-499e-bc08-00188c51379f-kube-api-access-2khxp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7wfjm\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.278566 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.831277 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"]
Feb 23 07:18:11 crc kubenswrapper[4626]: I0223 07:18:11.870552 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm" event={"ID":"ba805f29-0d45-499e-bc08-00188c51379f","Type":"ContainerStarted","Data":"4f0385dd2217d874064f046aed3067fb09344e9c0a215efb20f8a4bbced60ba4"}
Feb 23 07:18:12 crc kubenswrapper[4626]: I0223 07:18:12.882261 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm" event={"ID":"ba805f29-0d45-499e-bc08-00188c51379f","Type":"ContainerStarted","Data":"b2a2d52bfe64ff5f1c26f94313a8e4d68b2aa352d34e655cd0ecb1a144f176f4"}
Feb 23 07:18:12 crc kubenswrapper[4626]: I0223 07:18:12.898749 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm"
podStartSLOduration=2.228877601 podStartE2EDuration="2.89872321s" podCreationTimestamp="2026-02-23 07:18:10 +0000 UTC" firstStartedPulling="2026-02-23 07:18:11.846334286 +0000 UTC m=+2244.185663552" lastFinishedPulling="2026-02-23 07:18:12.516179895 +0000 UTC m=+2244.855509161" observedRunningTime="2026-02-23 07:18:12.897125598 +0000 UTC m=+2245.236454864" watchObservedRunningTime="2026-02-23 07:18:12.89872321 +0000 UTC m=+2245.238052467" Feb 23 07:18:25 crc kubenswrapper[4626]: I0223 07:18:25.685758 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:18:25 crc kubenswrapper[4626]: I0223 07:18:25.686624 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:18:55 crc kubenswrapper[4626]: I0223 07:18:55.685187 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:18:55 crc kubenswrapper[4626]: I0223 07:18:55.685604 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:19:25 crc kubenswrapper[4626]: I0223 07:19:25.685339 
4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:19:25 crc kubenswrapper[4626]: I0223 07:19:25.685808 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:19:25 crc kubenswrapper[4626]: I0223 07:19:25.685863 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:19:25 crc kubenswrapper[4626]: I0223 07:19:25.687106 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:19:25 crc kubenswrapper[4626]: I0223 07:19:25.687164 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" gracePeriod=600 Feb 23 07:19:25 crc kubenswrapper[4626]: E0223 07:19:25.811253 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:19:26 crc kubenswrapper[4626]: I0223 07:19:26.571292 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" exitCode=0 Feb 23 07:19:26 crc kubenswrapper[4626]: I0223 07:19:26.571349 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10"} Feb 23 07:19:26 crc kubenswrapper[4626]: I0223 07:19:26.571402 4626 scope.go:117] "RemoveContainer" containerID="522c533173b6dc8610119c7e6504c043a2bd72039bef9f6e0109c726475aba01" Feb 23 07:19:26 crc kubenswrapper[4626]: I0223 07:19:26.572569 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:19:26 crc kubenswrapper[4626]: E0223 07:19:26.573134 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:19:38 crc kubenswrapper[4626]: I0223 07:19:38.982596 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:19:38 crc kubenswrapper[4626]: E0223 07:19:38.983454 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:19:50 crc kubenswrapper[4626]: I0223 07:19:50.983069 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:19:50 crc kubenswrapper[4626]: E0223 07:19:50.984403 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:20:05 crc kubenswrapper[4626]: I0223 07:20:05.982597 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:20:05 crc kubenswrapper[4626]: E0223 07:20:05.983510 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:20:09 crc kubenswrapper[4626]: I0223 07:20:09.969492 4626 generic.go:334] "Generic (PLEG): container finished" podID="ba805f29-0d45-499e-bc08-00188c51379f" containerID="b2a2d52bfe64ff5f1c26f94313a8e4d68b2aa352d34e655cd0ecb1a144f176f4" exitCode=0 Feb 23 07:20:09 crc kubenswrapper[4626]: I0223 07:20:09.969551 
4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm" event={"ID":"ba805f29-0d45-499e-bc08-00188c51379f","Type":"ContainerDied","Data":"b2a2d52bfe64ff5f1c26f94313a8e4d68b2aa352d34e655cd0ecb1a144f176f4"} Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.358193 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.460428 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ba805f29-0d45-499e-bc08-00188c51379f-nova-extra-config-0\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.460524 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-0\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.460550 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-0\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.460578 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2khxp\" (UniqueName: \"kubernetes.io/projected/ba805f29-0d45-499e-bc08-00188c51379f-kube-api-access-2khxp\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: 
I0223 07:20:11.461298 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-2\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.461414 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-ssh-key-openstack-edpm-ipam\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.461465 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-3\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.461535 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-1\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.461562 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-combined-ca-bundle\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.461637 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-inventory\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.461772 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-1\") pod \"ba805f29-0d45-499e-bc08-00188c51379f\" (UID: \"ba805f29-0d45-499e-bc08-00188c51379f\") " Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.469145 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba805f29-0d45-499e-bc08-00188c51379f-kube-api-access-2khxp" (OuterVolumeSpecName: "kube-api-access-2khxp") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "kube-api-access-2khxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.475733 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.489481 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba805f29-0d45-499e-bc08-00188c51379f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.495938 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.502094 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.503418 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.511201 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.514290 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.525862 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-inventory" (OuterVolumeSpecName: "inventory") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.528293 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.530908 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ba805f29-0d45-499e-bc08-00188c51379f" (UID: "ba805f29-0d45-499e-bc08-00188c51379f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566160 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566263 4626 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566342 4626 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ba805f29-0d45-499e-bc08-00188c51379f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566396 4626 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566459 4626 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566538 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2khxp\" (UniqueName: \"kubernetes.io/projected/ba805f29-0d45-499e-bc08-00188c51379f-kube-api-access-2khxp\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566592 4626 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-2\") on 
node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566654 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566705 4626 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566766 4626 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.566825 4626 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba805f29-0d45-499e-bc08-00188c51379f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.991953 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm" Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.993417 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7wfjm" event={"ID":"ba805f29-0d45-499e-bc08-00188c51379f","Type":"ContainerDied","Data":"4f0385dd2217d874064f046aed3067fb09344e9c0a215efb20f8a4bbced60ba4"} Feb 23 07:20:11 crc kubenswrapper[4626]: I0223 07:20:11.993461 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f0385dd2217d874064f046aed3067fb09344e9c0a215efb20f8a4bbced60ba4" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.078679 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr"] Feb 23 07:20:12 crc kubenswrapper[4626]: E0223 07:20:12.079278 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba805f29-0d45-499e-bc08-00188c51379f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.079300 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba805f29-0d45-499e-bc08-00188c51379f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.079552 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba805f29-0d45-499e-bc08-00188c51379f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.080303 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.082882 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vw4h6" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.083248 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.083193 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.083206 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.086442 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.087139 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr"] Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.184155 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.184319 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: 
\"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.184381 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.184414 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.184546 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.184789 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bm54\" (UniqueName: \"kubernetes.io/projected/a8761a5a-7aba-46e2-9070-49cc7e866c7b-kube-api-access-8bm54\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.185131 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.287670 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.287737 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.287780 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 
07:20:12.287825 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.287885 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bm54\" (UniqueName: \"kubernetes.io/projected/a8761a5a-7aba-46e2-9070-49cc7e866c7b-kube-api-access-8bm54\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.287929 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.287987 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.294426 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.294695 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.295979 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.296217 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.299042 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: 
\"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.299084 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.309323 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bm54\" (UniqueName: \"kubernetes.io/projected/a8761a5a-7aba-46e2-9070-49cc7e866c7b-kube-api-access-8bm54\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.395655 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.891992 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr"] Feb 23 07:20:12 crc kubenswrapper[4626]: I0223 07:20:12.900493 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:20:13 crc kubenswrapper[4626]: I0223 07:20:13.002858 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" event={"ID":"a8761a5a-7aba-46e2-9070-49cc7e866c7b","Type":"ContainerStarted","Data":"8f560dc098baead995ec70ed7fc7f319a6e5d5830762edec9dd390456ae77cec"} Feb 23 07:20:14 crc kubenswrapper[4626]: I0223 07:20:14.012722 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" event={"ID":"a8761a5a-7aba-46e2-9070-49cc7e866c7b","Type":"ContainerStarted","Data":"897f1501f0166a13aaade9baa156ae8e3a10f969ed81447aab8b47a1b91704db"} Feb 23 07:20:14 crc kubenswrapper[4626]: I0223 07:20:14.035740 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" podStartSLOduration=1.45934937 podStartE2EDuration="2.035717032s" podCreationTimestamp="2026-02-23 07:20:12 +0000 UTC" firstStartedPulling="2026-02-23 07:20:12.900240539 +0000 UTC m=+2365.239569805" lastFinishedPulling="2026-02-23 07:20:13.4766082 +0000 UTC m=+2365.815937467" observedRunningTime="2026-02-23 07:20:14.035373475 +0000 UTC m=+2366.374702741" watchObservedRunningTime="2026-02-23 07:20:14.035717032 +0000 UTC m=+2366.375046289" Feb 23 07:20:17 crc kubenswrapper[4626]: I0223 07:20:17.988974 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:20:17 crc kubenswrapper[4626]: E0223 
07:20:17.989789 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:20:31 crc kubenswrapper[4626]: I0223 07:20:31.981825 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:20:31 crc kubenswrapper[4626]: E0223 07:20:31.982635 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:20:45 crc kubenswrapper[4626]: I0223 07:20:45.982541 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:20:45 crc kubenswrapper[4626]: E0223 07:20:45.983795 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:20:56 crc kubenswrapper[4626]: I0223 07:20:56.982672 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:20:56 crc 
kubenswrapper[4626]: E0223 07:20:56.983894 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:21:10 crc kubenswrapper[4626]: I0223 07:21:10.982835 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:21:10 crc kubenswrapper[4626]: E0223 07:21:10.983650 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:21:23 crc kubenswrapper[4626]: I0223 07:21:23.983391 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:21:23 crc kubenswrapper[4626]: E0223 07:21:23.984051 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:21:37 crc kubenswrapper[4626]: I0223 07:21:37.988793 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 
23 07:21:37 crc kubenswrapper[4626]: E0223 07:21:37.989757 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:21:48 crc kubenswrapper[4626]: I0223 07:21:48.983040 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:21:48 crc kubenswrapper[4626]: E0223 07:21:48.984071 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:22:03 crc kubenswrapper[4626]: I0223 07:22:03.982860 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:22:03 crc kubenswrapper[4626]: E0223 07:22:03.983968 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:22:17 crc kubenswrapper[4626]: I0223 07:22:17.988927 4626 scope.go:117] "RemoveContainer" 
containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:22:17 crc kubenswrapper[4626]: E0223 07:22:17.990083 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:22:23 crc kubenswrapper[4626]: I0223 07:22:23.262288 4626 generic.go:334] "Generic (PLEG): container finished" podID="a8761a5a-7aba-46e2-9070-49cc7e866c7b" containerID="897f1501f0166a13aaade9baa156ae8e3a10f969ed81447aab8b47a1b91704db" exitCode=0 Feb 23 07:22:23 crc kubenswrapper[4626]: I0223 07:22:23.262368 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" event={"ID":"a8761a5a-7aba-46e2-9070-49cc7e866c7b","Type":"ContainerDied","Data":"897f1501f0166a13aaade9baa156ae8e3a10f969ed81447aab8b47a1b91704db"} Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.627610 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.822817 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-0\") pod \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.822949 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-1\") pod \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.822990 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bm54\" (UniqueName: \"kubernetes.io/projected/a8761a5a-7aba-46e2-9070-49cc7e866c7b-kube-api-access-8bm54\") pod \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.823087 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-telemetry-combined-ca-bundle\") pod \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.823173 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ssh-key-openstack-edpm-ipam\") pod \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " Feb 23 
07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.823220 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-inventory\") pod \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.823309 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-2\") pod \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\" (UID: \"a8761a5a-7aba-46e2-9070-49cc7e866c7b\") " Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.830644 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8761a5a-7aba-46e2-9070-49cc7e866c7b-kube-api-access-8bm54" (OuterVolumeSpecName: "kube-api-access-8bm54") pod "a8761a5a-7aba-46e2-9070-49cc7e866c7b" (UID: "a8761a5a-7aba-46e2-9070-49cc7e866c7b"). InnerVolumeSpecName "kube-api-access-8bm54". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.856691 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a8761a5a-7aba-46e2-9070-49cc7e866c7b" (UID: "a8761a5a-7aba-46e2-9070-49cc7e866c7b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.857685 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a8761a5a-7aba-46e2-9070-49cc7e866c7b" (UID: "a8761a5a-7aba-46e2-9070-49cc7e866c7b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.858777 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a8761a5a-7aba-46e2-9070-49cc7e866c7b" (UID: "a8761a5a-7aba-46e2-9070-49cc7e866c7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.860075 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a8761a5a-7aba-46e2-9070-49cc7e866c7b" (UID: "a8761a5a-7aba-46e2-9070-49cc7e866c7b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.874215 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a8761a5a-7aba-46e2-9070-49cc7e866c7b" (UID: "a8761a5a-7aba-46e2-9070-49cc7e866c7b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.875020 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-inventory" (OuterVolumeSpecName: "inventory") pod "a8761a5a-7aba-46e2-9070-49cc7e866c7b" (UID: "a8761a5a-7aba-46e2-9070-49cc7e866c7b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.927806 4626 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.927927 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bm54\" (UniqueName: \"kubernetes.io/projected/a8761a5a-7aba-46e2-9070-49cc7e866c7b-kube-api-access-8bm54\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.928021 4626 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.928108 4626 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.928188 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 
07:22:24.928256 4626 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:24 crc kubenswrapper[4626]: I0223 07:22:24.928323 4626 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a8761a5a-7aba-46e2-9070-49cc7e866c7b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 23 07:22:25 crc kubenswrapper[4626]: I0223 07:22:25.285524 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" event={"ID":"a8761a5a-7aba-46e2-9070-49cc7e866c7b","Type":"ContainerDied","Data":"8f560dc098baead995ec70ed7fc7f319a6e5d5830762edec9dd390456ae77cec"} Feb 23 07:22:25 crc kubenswrapper[4626]: I0223 07:22:25.285584 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f560dc098baead995ec70ed7fc7f319a6e5d5830762edec9dd390456ae77cec" Feb 23 07:22:25 crc kubenswrapper[4626]: I0223 07:22:25.285603 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr" Feb 23 07:22:30 crc kubenswrapper[4626]: I0223 07:22:30.982393 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:22:30 crc kubenswrapper[4626]: E0223 07:22:30.982918 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:22:45 crc kubenswrapper[4626]: I0223 07:22:45.982516 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:22:45 crc kubenswrapper[4626]: E0223 07:22:45.983760 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:22:59 crc kubenswrapper[4626]: I0223 07:22:59.983521 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:22:59 crc kubenswrapper[4626]: E0223 07:22:59.984594 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:23:12 crc kubenswrapper[4626]: I0223 07:23:12.982688 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:23:12 crc kubenswrapper[4626]: E0223 07:23:12.983678 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:23:17 crc kubenswrapper[4626]: I0223 07:23:17.624448 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6465458495-hgsdz" podUID="434e199c-4e18-4274-bbaa-f81f2e2a697b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.627311 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"] Feb 23 07:23:18 crc kubenswrapper[4626]: E0223 07:23:18.628005 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8761a5a-7aba-46e2-9070-49cc7e866c7b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.628021 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8761a5a-7aba-46e2-9070-49cc7e866c7b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.628236 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8761a5a-7aba-46e2-9070-49cc7e866c7b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 23 07:23:18 crc 
kubenswrapper[4626]: I0223 07:23:18.628901 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.631959 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.632294 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.632767 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-pdr96" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.633048 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.641352 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"] Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781481 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781648 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781692 4626 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781718 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781736 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781776 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781795 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781844 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.781932 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnvws\" (UniqueName: \"kubernetes.io/projected/71976b26-a20d-4173-98a9-e4d5b553fb8b-kube-api-access-qnvws\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884357 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884409 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " 
pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884467 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884580 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnvws\" (UniqueName: \"kubernetes.io/projected/71976b26-a20d-4173-98a9-e4d5b553fb8b-kube-api-access-qnvws\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884609 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884656 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884693 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884716 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884753 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.884917 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.886057 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " 
pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.886675 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.886699 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.891017 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.893021 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.893129 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.894287 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.903920 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnvws\" (UniqueName: \"kubernetes.io/projected/71976b26-a20d-4173-98a9-e4d5b553fb8b-kube-api-access-qnvws\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.913754 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:18 crc kubenswrapper[4626]: I0223 07:23:18.950736 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 23 07:23:19 crc kubenswrapper[4626]: I0223 07:23:19.459748 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"] Feb 23 07:23:19 crc kubenswrapper[4626]: I0223 07:23:19.817053 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"71976b26-a20d-4173-98a9-e4d5b553fb8b","Type":"ContainerStarted","Data":"cde27dfb87b9b97aa4139bda27c86b3c974fd5f271939f111022754cb8b0ccc1"} Feb 23 07:23:23 crc kubenswrapper[4626]: I0223 07:23:23.984117 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:23:23 crc kubenswrapper[4626]: E0223 07:23:23.985974 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:23:34 crc kubenswrapper[4626]: I0223 07:23:34.983858 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:23:34 crc kubenswrapper[4626]: E0223 07:23:34.984828 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:23:49 crc kubenswrapper[4626]: I0223 
07:23:49.982514 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:23:49 crc kubenswrapper[4626]: E0223 07:23:49.984917 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:24:03 crc kubenswrapper[4626]: I0223 07:24:03.982950 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:24:03 crc kubenswrapper[4626]: E0223 07:24:03.983639 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:24:07 crc kubenswrapper[4626]: E0223 07:24:07.382213 4626 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb" Feb 23 07:24:07 crc kubenswrapper[4626]: E0223 07:24:07.382692 4626 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb" Feb 23 07:24:07 crc kubenswrapper[4626]: E0223 07:24:07.385221 4626 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qnvws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest-s00-multi-thread-testing_openstack(71976b26-a20d-4173-98a9-e4d5b553fb8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:24:07 crc kubenswrapper[4626]: E0223 07:24:07.386475 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="71976b26-a20d-4173-98a9-e4d5b553fb8b" Feb 23 07:24:08 crc kubenswrapper[4626]: E0223 07:24:08.357049 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="71976b26-a20d-4173-98a9-e4d5b553fb8b" Feb 23 07:24:17 crc kubenswrapper[4626]: I0223 07:24:17.989452 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:24:17 crc kubenswrapper[4626]: E0223 07:24:17.991355 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:24:18 crc kubenswrapper[4626]: I0223 07:24:18.995203 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ptj6b"] Feb 23 07:24:18 crc kubenswrapper[4626]: I0223 07:24:18.998450 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.016649 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-catalog-content\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.016949 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-utilities\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.017085 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc66p\" (UniqueName: \"kubernetes.io/projected/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-kube-api-access-fc66p\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.018736 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ptj6b"] Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.121866 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-catalog-content\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.122000 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-utilities\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.122057 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc66p\" (UniqueName: \"kubernetes.io/projected/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-kube-api-access-fc66p\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.122540 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-utilities\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.122540 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-catalog-content\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.145225 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc66p\" (UniqueName: \"kubernetes.io/projected/b9fa1da3-e2df-48d0-99ca-ea2c952f8c49-kube-api-access-fc66p\") pod \"certified-operators-ptj6b\" (UID: \"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49\") " pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.324838 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:19 crc kubenswrapper[4626]: I0223 07:24:19.691872 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ptj6b"] Feb 23 07:24:20 crc kubenswrapper[4626]: I0223 07:24:20.462406 4626 generic.go:334] "Generic (PLEG): container finished" podID="b9fa1da3-e2df-48d0-99ca-ea2c952f8c49" containerID="0ab957d3549258d2a2aea4eb21df88faeeb4345f3cea000d97b06a8a3e375ba0" exitCode=0 Feb 23 07:24:20 crc kubenswrapper[4626]: I0223 07:24:20.462470 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptj6b" event={"ID":"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49","Type":"ContainerDied","Data":"0ab957d3549258d2a2aea4eb21df88faeeb4345f3cea000d97b06a8a3e375ba0"} Feb 23 07:24:20 crc kubenswrapper[4626]: I0223 07:24:20.462526 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptj6b" event={"ID":"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49","Type":"ContainerStarted","Data":"d72a0900c0438f8ebcc1e059e6ff324e963ff1764d8397375c5a17c9ec049976"} Feb 23 07:24:22 crc kubenswrapper[4626]: I0223 07:24:22.643107 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 23 07:24:24 crc kubenswrapper[4626]: I0223 07:24:24.505423 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"71976b26-a20d-4173-98a9-e4d5b553fb8b","Type":"ContainerStarted","Data":"ce3770cf9d1d69675d9892d368761375d5dc5c0d146efc01f1cbe09d7d308a9b"} Feb 23 07:24:24 crc kubenswrapper[4626]: I0223 07:24:24.522648 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podStartSLOduration=4.355664355 podStartE2EDuration="1m7.522606669s" podCreationTimestamp="2026-02-23 07:23:17 +0000 UTC" 
firstStartedPulling="2026-02-23 07:23:19.472690543 +0000 UTC m=+2551.812019798" lastFinishedPulling="2026-02-23 07:24:22.639632847 +0000 UTC m=+2614.978962112" observedRunningTime="2026-02-23 07:24:24.521072817 +0000 UTC m=+2616.860402083" watchObservedRunningTime="2026-02-23 07:24:24.522606669 +0000 UTC m=+2616.861935936" Feb 23 07:24:26 crc kubenswrapper[4626]: E0223 07:24:26.416178 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9fa1da3_e2df_48d0_99ca_ea2c952f8c49.slice/crio-conmon-be6b7920ddb78db29dbd47facb0d3ffd4915f9f4068bf6b182172c401e9b4824.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:24:26 crc kubenswrapper[4626]: I0223 07:24:26.529069 4626 generic.go:334] "Generic (PLEG): container finished" podID="b9fa1da3-e2df-48d0-99ca-ea2c952f8c49" containerID="be6b7920ddb78db29dbd47facb0d3ffd4915f9f4068bf6b182172c401e9b4824" exitCode=0 Feb 23 07:24:26 crc kubenswrapper[4626]: I0223 07:24:26.529139 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptj6b" event={"ID":"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49","Type":"ContainerDied","Data":"be6b7920ddb78db29dbd47facb0d3ffd4915f9f4068bf6b182172c401e9b4824"} Feb 23 07:24:27 crc kubenswrapper[4626]: I0223 07:24:27.544192 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ptj6b" event={"ID":"b9fa1da3-e2df-48d0-99ca-ea2c952f8c49","Type":"ContainerStarted","Data":"cd95546d35b1576b7cb7623b9a9e34e17c23c11fe79648aa69cf4c351e735b0f"} Feb 23 07:24:27 crc kubenswrapper[4626]: I0223 07:24:27.579680 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ptj6b" podStartSLOduration=2.966501434 podStartE2EDuration="9.57966227s" podCreationTimestamp="2026-02-23 07:24:18 +0000 UTC" firstStartedPulling="2026-02-23 
07:24:20.464145554 +0000 UTC m=+2612.803474821" lastFinishedPulling="2026-02-23 07:24:27.0773064 +0000 UTC m=+2619.416635657" observedRunningTime="2026-02-23 07:24:27.566874017 +0000 UTC m=+2619.906203274" watchObservedRunningTime="2026-02-23 07:24:27.57966227 +0000 UTC m=+2619.918991525" Feb 23 07:24:29 crc kubenswrapper[4626]: I0223 07:24:29.325416 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:29 crc kubenswrapper[4626]: I0223 07:24:29.326143 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:29 crc kubenswrapper[4626]: I0223 07:24:29.367707 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:29 crc kubenswrapper[4626]: I0223 07:24:29.983030 4626 scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:24:30 crc kubenswrapper[4626]: I0223 07:24:30.581104 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"85ab38e54b615607291465c65e9cd58e607ced847ce6f1429e25bf1ae5a33a06"} Feb 23 07:24:39 crc kubenswrapper[4626]: I0223 07:24:39.366685 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ptj6b" Feb 23 07:24:39 crc kubenswrapper[4626]: I0223 07:24:39.472044 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ptj6b"] Feb 23 07:24:39 crc kubenswrapper[4626]: I0223 07:24:39.499377 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fpznx"] Feb 23 07:24:39 crc kubenswrapper[4626]: I0223 07:24:39.499644 4626 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fpznx" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="registry-server" containerID="cri-o://f576fedf877826fb0e97e31350b22aba1c72ff67d25171bdf9a417159566975b" gracePeriod=2 Feb 23 07:24:39 crc kubenswrapper[4626]: I0223 07:24:39.686573 4626 generic.go:334] "Generic (PLEG): container finished" podID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerID="f576fedf877826fb0e97e31350b22aba1c72ff67d25171bdf9a417159566975b" exitCode=0 Feb 23 07:24:39 crc kubenswrapper[4626]: I0223 07:24:39.687382 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpznx" event={"ID":"0e583bb8-92ca-41bc-bdf4-82032820e1ff","Type":"ContainerDied","Data":"f576fedf877826fb0e97e31350b22aba1c72ff67d25171bdf9a417159566975b"} Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.012562 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.178704 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7mvn\" (UniqueName: \"kubernetes.io/projected/0e583bb8-92ca-41bc-bdf4-82032820e1ff-kube-api-access-m7mvn\") pod \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.178921 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-utilities\") pod \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.179011 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-catalog-content\") pod \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\" (UID: \"0e583bb8-92ca-41bc-bdf4-82032820e1ff\") " Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.180467 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-utilities" (OuterVolumeSpecName: "utilities") pod "0e583bb8-92ca-41bc-bdf4-82032820e1ff" (UID: "0e583bb8-92ca-41bc-bdf4-82032820e1ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.186995 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e583bb8-92ca-41bc-bdf4-82032820e1ff-kube-api-access-m7mvn" (OuterVolumeSpecName: "kube-api-access-m7mvn") pod "0e583bb8-92ca-41bc-bdf4-82032820e1ff" (UID: "0e583bb8-92ca-41bc-bdf4-82032820e1ff"). InnerVolumeSpecName "kube-api-access-m7mvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.227321 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e583bb8-92ca-41bc-bdf4-82032820e1ff" (UID: "0e583bb8-92ca-41bc-bdf4-82032820e1ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.282818 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7mvn\" (UniqueName: \"kubernetes.io/projected/0e583bb8-92ca-41bc-bdf4-82032820e1ff-kube-api-access-m7mvn\") on node \"crc\" DevicePath \"\"" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.282853 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.282864 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e583bb8-92ca-41bc-bdf4-82032820e1ff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.697409 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fpznx" event={"ID":"0e583bb8-92ca-41bc-bdf4-82032820e1ff","Type":"ContainerDied","Data":"16eaf2654483cdb31fbaaa978274f5dff20a9cfe32045bfd66ecfe312879de15"} Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.697469 4626 scope.go:117] "RemoveContainer" containerID="f576fedf877826fb0e97e31350b22aba1c72ff67d25171bdf9a417159566975b" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.697622 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fpznx" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.731018 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fpznx"] Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.737407 4626 scope.go:117] "RemoveContainer" containerID="ffcb7474aaa3ad06ba88b798c87579815df7dccc3d57f4d945a94985a910c9c5" Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.739874 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fpznx"] Feb 23 07:24:40 crc kubenswrapper[4626]: I0223 07:24:40.759140 4626 scope.go:117] "RemoveContainer" containerID="7f59624a371cfbc5c1c954567e1247e354d2ed15c94f6b17b9f97f936e9c3fa2" Feb 23 07:24:41 crc kubenswrapper[4626]: I0223 07:24:41.995520 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" path="/var/lib/kubelet/pods/0e583bb8-92ca-41bc-bdf4-82032820e1ff/volumes" Feb 23 07:26:15 crc kubenswrapper[4626]: E0223 07:26:15.990427 4626 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.58:52974->192.168.26.58:46805: read tcp 192.168.26.58:52974->192.168.26.58:46805: read: connection reset by peer Feb 23 07:26:55 crc kubenswrapper[4626]: I0223 07:26:55.685523 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:26:55 crc kubenswrapper[4626]: I0223 07:26:55.686776 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:27:25 crc kubenswrapper[4626]: I0223 07:27:25.685807 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:27:25 crc kubenswrapper[4626]: I0223 07:27:25.686786 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.049333 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zqm4j"] Feb 23 07:27:37 crc kubenswrapper[4626]: E0223 07:27:37.053240 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="registry-server" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.053355 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="registry-server" Feb 23 07:27:37 crc kubenswrapper[4626]: E0223 07:27:37.053831 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="extract-content" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.053855 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="extract-content" Feb 23 07:27:37 crc kubenswrapper[4626]: E0223 07:27:37.053879 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="extract-utilities" Feb 23 07:27:37 crc 
kubenswrapper[4626]: I0223 07:27:37.053888 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="extract-utilities" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.054631 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e583bb8-92ca-41bc-bdf4-82032820e1ff" containerName="registry-server" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.058656 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.133652 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zqm4j"] Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.164196 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.166272 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-utilities\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.166401 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxlk7\" (UniqueName: \"kubernetes.io/projected/b1437afa-b67e-486f-96c1-2c18eefded03-kube-api-access-dxlk7\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 
07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.270305 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.270733 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-utilities\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.271073 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxlk7\" (UniqueName: \"kubernetes.io/projected/b1437afa-b67e-486f-96c1-2c18eefded03-kube-api-access-dxlk7\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.274432 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.275576 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-utilities\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 
07:27:37.301522 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxlk7\" (UniqueName: \"kubernetes.io/projected/b1437afa-b67e-486f-96c1-2c18eefded03-kube-api-access-dxlk7\") pod \"community-operators-zqm4j\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") " pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:37 crc kubenswrapper[4626]: I0223 07:27:37.380599 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:38 crc kubenswrapper[4626]: I0223 07:27:38.227079 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zqm4j"] Feb 23 07:27:38 crc kubenswrapper[4626]: I0223 07:27:38.336147 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqm4j" event={"ID":"b1437afa-b67e-486f-96c1-2c18eefded03","Type":"ContainerStarted","Data":"7cbd413d343f2a6e5f034d2c38e87511db0dac59e35ad2523202912d3b92b882"} Feb 23 07:27:39 crc kubenswrapper[4626]: I0223 07:27:39.363328 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqm4j" event={"ID":"b1437afa-b67e-486f-96c1-2c18eefded03","Type":"ContainerDied","Data":"00adc67f1a1ece3be40c2e42905dd4962461bb688b80701ddf07e440700feb3a"} Feb 23 07:27:39 crc kubenswrapper[4626]: I0223 07:27:39.363597 4626 generic.go:334] "Generic (PLEG): container finished" podID="b1437afa-b67e-486f-96c1-2c18eefded03" containerID="00adc67f1a1ece3be40c2e42905dd4962461bb688b80701ddf07e440700feb3a" exitCode=0 Feb 23 07:27:39 crc kubenswrapper[4626]: I0223 07:27:39.373117 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.327530 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vvcgs"] Feb 23 07:27:40 crc kubenswrapper[4626]: 
I0223 07:27:40.329404 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.350720 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvcgs"] Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.437194 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-utilities\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.437256 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drcd\" (UniqueName: \"kubernetes.io/projected/f2aa552b-c93c-44f9-88d8-5641e54e129a-kube-api-access-7drcd\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.437567 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-catalog-content\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.540956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-utilities\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 
07:27:40.541034 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7drcd\" (UniqueName: \"kubernetes.io/projected/f2aa552b-c93c-44f9-88d8-5641e54e129a-kube-api-access-7drcd\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.541192 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-catalog-content\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.544395 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-utilities\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.544976 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-catalog-content\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.567485 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drcd\" (UniqueName: \"kubernetes.io/projected/f2aa552b-c93c-44f9-88d8-5641e54e129a-kube-api-access-7drcd\") pod \"redhat-marketplace-vvcgs\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") " pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.657097 4626 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.732368 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x4r7v"] Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.734341 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.763031 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x4r7v"] Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.853666 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-utilities\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.853704 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-catalog-content\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.853976 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnm4\" (UniqueName: \"kubernetes.io/projected/37327461-82a8-4ebb-b2cc-1e3e17599334-kube-api-access-wwnm4\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.956639 4626 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-wwnm4\" (UniqueName: \"kubernetes.io/projected/37327461-82a8-4ebb-b2cc-1e3e17599334-kube-api-access-wwnm4\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.956729 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-utilities\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.956755 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-catalog-content\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.957654 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-catalog-content\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.959986 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-utilities\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:40 crc kubenswrapper[4626]: I0223 07:27:40.986031 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnm4\" (UniqueName: 
\"kubernetes.io/projected/37327461-82a8-4ebb-b2cc-1e3e17599334-kube-api-access-wwnm4\") pod \"redhat-operators-x4r7v\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") " pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:41 crc kubenswrapper[4626]: I0223 07:27:41.065003 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:41 crc kubenswrapper[4626]: I0223 07:27:41.204850 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvcgs"] Feb 23 07:27:41 crc kubenswrapper[4626]: I0223 07:27:41.386355 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvcgs" event={"ID":"f2aa552b-c93c-44f9-88d8-5641e54e129a","Type":"ContainerStarted","Data":"391e4bac298081b88300ff8ef10a08795e67948039fe1c1467b90b1e602907d5"} Feb 23 07:27:41 crc kubenswrapper[4626]: I0223 07:27:41.389373 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqm4j" event={"ID":"b1437afa-b67e-486f-96c1-2c18eefded03","Type":"ContainerStarted","Data":"b874055d0b4b8497513b663873148eba7842242cb212f00e4d22bedc475ed1ef"} Feb 23 07:27:41 crc kubenswrapper[4626]: I0223 07:27:41.567151 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x4r7v"] Feb 23 07:27:41 crc kubenswrapper[4626]: W0223 07:27:41.577522 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37327461_82a8_4ebb_b2cc_1e3e17599334.slice/crio-f15bbe23a9b8c994b4763e375acfff6ca7894472382b1dc897c012d629c7c01f WatchSource:0}: Error finding container f15bbe23a9b8c994b4763e375acfff6ca7894472382b1dc897c012d629c7c01f: Status 404 returned error can't find the container with id f15bbe23a9b8c994b4763e375acfff6ca7894472382b1dc897c012d629c7c01f Feb 23 07:27:42 crc kubenswrapper[4626]: I0223 
07:27:42.420611 4626 generic.go:334] "Generic (PLEG): container finished" podID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerID="607ad686d6047fc0343f08d88207574b6551d592f4ca1d4e12ab1b776e369dca" exitCode=0 Feb 23 07:27:42 crc kubenswrapper[4626]: I0223 07:27:42.421512 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x4r7v" event={"ID":"37327461-82a8-4ebb-b2cc-1e3e17599334","Type":"ContainerDied","Data":"607ad686d6047fc0343f08d88207574b6551d592f4ca1d4e12ab1b776e369dca"} Feb 23 07:27:42 crc kubenswrapper[4626]: I0223 07:27:42.421559 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x4r7v" event={"ID":"37327461-82a8-4ebb-b2cc-1e3e17599334","Type":"ContainerStarted","Data":"f15bbe23a9b8c994b4763e375acfff6ca7894472382b1dc897c012d629c7c01f"} Feb 23 07:27:42 crc kubenswrapper[4626]: I0223 07:27:42.425709 4626 generic.go:334] "Generic (PLEG): container finished" podID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerID="f411333b41409b2d97029cb7ddf273cae3d05a285d92635af3c6e24ce2ada26a" exitCode=0 Feb 23 07:27:42 crc kubenswrapper[4626]: I0223 07:27:42.425788 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvcgs" event={"ID":"f2aa552b-c93c-44f9-88d8-5641e54e129a","Type":"ContainerDied","Data":"f411333b41409b2d97029cb7ddf273cae3d05a285d92635af3c6e24ce2ada26a"} Feb 23 07:27:42 crc kubenswrapper[4626]: I0223 07:27:42.429226 4626 generic.go:334] "Generic (PLEG): container finished" podID="b1437afa-b67e-486f-96c1-2c18eefded03" containerID="b874055d0b4b8497513b663873148eba7842242cb212f00e4d22bedc475ed1ef" exitCode=0 Feb 23 07:27:42 crc kubenswrapper[4626]: I0223 07:27:42.429269 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqm4j" 
event={"ID":"b1437afa-b67e-486f-96c1-2c18eefded03","Type":"ContainerDied","Data":"b874055d0b4b8497513b663873148eba7842242cb212f00e4d22bedc475ed1ef"} Feb 23 07:27:43 crc kubenswrapper[4626]: I0223 07:27:43.444061 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x4r7v" event={"ID":"37327461-82a8-4ebb-b2cc-1e3e17599334","Type":"ContainerStarted","Data":"7e02d6d960b6eeeac4320d504cc4d2b6a5d7df5c5f18c97b7687a978fb88d4a4"} Feb 23 07:27:43 crc kubenswrapper[4626]: I0223 07:27:43.446365 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvcgs" event={"ID":"f2aa552b-c93c-44f9-88d8-5641e54e129a","Type":"ContainerStarted","Data":"c35535a4d2aaefd340b99b9a4131945769469bf4239f833556e087fe25f5b2fb"} Feb 23 07:27:43 crc kubenswrapper[4626]: I0223 07:27:43.449838 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqm4j" event={"ID":"b1437afa-b67e-486f-96c1-2c18eefded03","Type":"ContainerStarted","Data":"a3ef1580aa75e2a754a9591ac5df78e3542f54ee0b60844c8ff353f993d169da"} Feb 23 07:27:43 crc kubenswrapper[4626]: I0223 07:27:43.481573 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zqm4j" podStartSLOduration=3.932006855 podStartE2EDuration="7.481557372s" podCreationTimestamp="2026-02-23 07:27:36 +0000 UTC" firstStartedPulling="2026-02-23 07:27:39.369560305 +0000 UTC m=+2811.708889571" lastFinishedPulling="2026-02-23 07:27:42.919110822 +0000 UTC m=+2815.258440088" observedRunningTime="2026-02-23 07:27:43.472802354 +0000 UTC m=+2815.812131611" watchObservedRunningTime="2026-02-23 07:27:43.481557372 +0000 UTC m=+2815.820886639" Feb 23 07:27:44 crc kubenswrapper[4626]: I0223 07:27:44.458975 4626 generic.go:334] "Generic (PLEG): container finished" podID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerID="c35535a4d2aaefd340b99b9a4131945769469bf4239f833556e087fe25f5b2fb" 
exitCode=0 Feb 23 07:27:44 crc kubenswrapper[4626]: I0223 07:27:44.459052 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvcgs" event={"ID":"f2aa552b-c93c-44f9-88d8-5641e54e129a","Type":"ContainerDied","Data":"c35535a4d2aaefd340b99b9a4131945769469bf4239f833556e087fe25f5b2fb"} Feb 23 07:27:45 crc kubenswrapper[4626]: I0223 07:27:45.469990 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvcgs" event={"ID":"f2aa552b-c93c-44f9-88d8-5641e54e129a","Type":"ContainerStarted","Data":"611484c96ef9cb8c09fb27d04dc43cc4dbbd053f0e0f8f07abf4bf4e770b7581"} Feb 23 07:27:45 crc kubenswrapper[4626]: I0223 07:27:45.489510 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vvcgs" podStartSLOduration=3.018931286 podStartE2EDuration="5.489476602s" podCreationTimestamp="2026-02-23 07:27:40 +0000 UTC" firstStartedPulling="2026-02-23 07:27:42.42801834 +0000 UTC m=+2814.767347607" lastFinishedPulling="2026-02-23 07:27:44.898563658 +0000 UTC m=+2817.237892923" observedRunningTime="2026-02-23 07:27:45.4848723 +0000 UTC m=+2817.824201566" watchObservedRunningTime="2026-02-23 07:27:45.489476602 +0000 UTC m=+2817.828805868" Feb 23 07:27:47 crc kubenswrapper[4626]: I0223 07:27:47.381489 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:47 crc kubenswrapper[4626]: I0223 07:27:47.381737 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:47 crc kubenswrapper[4626]: I0223 07:27:47.486467 4626 generic.go:334] "Generic (PLEG): container finished" podID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerID="7e02d6d960b6eeeac4320d504cc4d2b6a5d7df5c5f18c97b7687a978fb88d4a4" exitCode=0 Feb 23 07:27:47 crc kubenswrapper[4626]: I0223 07:27:47.486523 4626 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x4r7v" event={"ID":"37327461-82a8-4ebb-b2cc-1e3e17599334","Type":"ContainerDied","Data":"7e02d6d960b6eeeac4320d504cc4d2b6a5d7df5c5f18c97b7687a978fb88d4a4"} Feb 23 07:27:48 crc kubenswrapper[4626]: I0223 07:27:48.446569 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zqm4j" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="registry-server" probeResult="failure" output=< Feb 23 07:27:48 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:27:48 crc kubenswrapper[4626]: > Feb 23 07:27:48 crc kubenswrapper[4626]: I0223 07:27:48.495900 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x4r7v" event={"ID":"37327461-82a8-4ebb-b2cc-1e3e17599334","Type":"ContainerStarted","Data":"51ddadaf61de80f400e1f054560a9197dee2cbbda6565f8090c674d4c22f1ca7"} Feb 23 07:27:48 crc kubenswrapper[4626]: I0223 07:27:48.531095 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x4r7v" podStartSLOduration=2.958969996 podStartE2EDuration="8.531075037s" podCreationTimestamp="2026-02-23 07:27:40 +0000 UTC" firstStartedPulling="2026-02-23 07:27:42.423369885 +0000 UTC m=+2814.762699151" lastFinishedPulling="2026-02-23 07:27:47.995474926 +0000 UTC m=+2820.334804192" observedRunningTime="2026-02-23 07:27:48.526013484 +0000 UTC m=+2820.865342751" watchObservedRunningTime="2026-02-23 07:27:48.531075037 +0000 UTC m=+2820.870404293" Feb 23 07:27:50 crc kubenswrapper[4626]: I0223 07:27:50.657455 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:27:50 crc kubenswrapper[4626]: I0223 07:27:50.658638 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vvcgs" 
Feb 23 07:27:51 crc kubenswrapper[4626]: I0223 07:27:51.065823 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:51 crc kubenswrapper[4626]: I0223 07:27:51.065889 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x4r7v" Feb 23 07:27:51 crc kubenswrapper[4626]: I0223 07:27:51.702974 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-vvcgs" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="registry-server" probeResult="failure" output=< Feb 23 07:27:51 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:27:51 crc kubenswrapper[4626]: > Feb 23 07:27:52 crc kubenswrapper[4626]: I0223 07:27:52.117082 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x4r7v" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="registry-server" probeResult="failure" output=< Feb 23 07:27:52 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:27:52 crc kubenswrapper[4626]: > Feb 23 07:27:55 crc kubenswrapper[4626]: I0223 07:27:55.686192 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:27:55 crc kubenswrapper[4626]: I0223 07:27:55.687911 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:27:55 crc kubenswrapper[4626]: 
I0223 07:27:55.687994 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:27:55 crc kubenswrapper[4626]: I0223 07:27:55.690563 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85ab38e54b615607291465c65e9cd58e607ced847ce6f1429e25bf1ae5a33a06"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:27:55 crc kubenswrapper[4626]: I0223 07:27:55.691333 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://85ab38e54b615607291465c65e9cd58e607ced847ce6f1429e25bf1ae5a33a06" gracePeriod=600 Feb 23 07:27:56 crc kubenswrapper[4626]: I0223 07:27:56.564854 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"85ab38e54b615607291465c65e9cd58e607ced847ce6f1429e25bf1ae5a33a06"} Feb 23 07:27:56 crc kubenswrapper[4626]: I0223 07:27:56.565157 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="85ab38e54b615607291465c65e9cd58e607ced847ce6f1429e25bf1ae5a33a06" exitCode=0 Feb 23 07:27:56 crc kubenswrapper[4626]: I0223 07:27:56.565632 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac"} Feb 23 07:27:56 crc kubenswrapper[4626]: I0223 07:27:56.568118 4626 
scope.go:117] "RemoveContainer" containerID="cab33f814615b6f458da6f63d43d775f32f4f568c25b5442397c06f7ac4c7b10" Feb 23 07:27:57 crc kubenswrapper[4626]: I0223 07:27:57.714839 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:57 crc kubenswrapper[4626]: I0223 07:27:57.803838 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zqm4j" Feb 23 07:27:59 crc kubenswrapper[4626]: I0223 07:27:59.916245 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zqm4j"] Feb 23 07:27:59 crc kubenswrapper[4626]: I0223 07:27:59.919464 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zqm4j" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="registry-server" containerID="cri-o://a3ef1580aa75e2a754a9591ac5df78e3542f54ee0b60844c8ff353f993d169da" gracePeriod=2 Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.609107 4626 generic.go:334] "Generic (PLEG): container finished" podID="b1437afa-b67e-486f-96c1-2c18eefded03" containerID="a3ef1580aa75e2a754a9591ac5df78e3542f54ee0b60844c8ff353f993d169da" exitCode=0 Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.609232 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqm4j" event={"ID":"b1437afa-b67e-486f-96c1-2c18eefded03","Type":"ContainerDied","Data":"a3ef1580aa75e2a754a9591ac5df78e3542f54ee0b60844c8ff353f993d169da"} Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.717926 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.768347 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vvcgs" Feb 
23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.900738 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zqm4j"
Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.991785 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-utilities\") pod \"b1437afa-b67e-486f-96c1-2c18eefded03\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") "
Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.992903 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxlk7\" (UniqueName: \"kubernetes.io/projected/b1437afa-b67e-486f-96c1-2c18eefded03-kube-api-access-dxlk7\") pod \"b1437afa-b67e-486f-96c1-2c18eefded03\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") "
Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.993140 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content\") pod \"b1437afa-b67e-486f-96c1-2c18eefded03\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") "
Feb 23 07:28:00 crc kubenswrapper[4626]: I0223 07:28:00.994025 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-utilities" (OuterVolumeSpecName: "utilities") pod "b1437afa-b67e-486f-96c1-2c18eefded03" (UID: "b1437afa-b67e-486f-96c1-2c18eefded03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.011251 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1437afa-b67e-486f-96c1-2c18eefded03-kube-api-access-dxlk7" (OuterVolumeSpecName: "kube-api-access-dxlk7") pod "b1437afa-b67e-486f-96c1-2c18eefded03" (UID: "b1437afa-b67e-486f-96c1-2c18eefded03"). InnerVolumeSpecName "kube-api-access-dxlk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.095806 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1437afa-b67e-486f-96c1-2c18eefded03" (UID: "b1437afa-b67e-486f-96c1-2c18eefded03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.098737 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content\") pod \"b1437afa-b67e-486f-96c1-2c18eefded03\" (UID: \"b1437afa-b67e-486f-96c1-2c18eefded03\") "
Feb 23 07:28:01 crc kubenswrapper[4626]: W0223 07:28:01.099751 4626 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b1437afa-b67e-486f-96c1-2c18eefded03/volumes/kubernetes.io~empty-dir/catalog-content
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.100068 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1437afa-b67e-486f-96c1-2c18eefded03" (UID: "b1437afa-b67e-486f-96c1-2c18eefded03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.100461 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.100597 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxlk7\" (UniqueName: \"kubernetes.io/projected/b1437afa-b67e-486f-96c1-2c18eefded03-kube-api-access-dxlk7\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.101454 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1437afa-b67e-486f-96c1-2c18eefded03-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.622546 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqm4j" event={"ID":"b1437afa-b67e-486f-96c1-2c18eefded03","Type":"ContainerDied","Data":"7cbd413d343f2a6e5f034d2c38e87511db0dac59e35ad2523202912d3b92b882"}
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.622848 4626 scope.go:117] "RemoveContainer" containerID="a3ef1580aa75e2a754a9591ac5df78e3542f54ee0b60844c8ff353f993d169da"
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.622581 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zqm4j"
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.655150 4626 scope.go:117] "RemoveContainer" containerID="b874055d0b4b8497513b663873148eba7842242cb212f00e4d22bedc475ed1ef"
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.672851 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zqm4j"]
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.692544 4626 scope.go:117] "RemoveContainer" containerID="00adc67f1a1ece3be40c2e42905dd4962461bb688b80701ddf07e440700feb3a"
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.701298 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zqm4j"]
Feb 23 07:28:01 crc kubenswrapper[4626]: I0223 07:28:01.992312 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" path="/var/lib/kubelet/pods/b1437afa-b67e-486f-96c1-2c18eefded03/volumes"
Feb 23 07:28:02 crc kubenswrapper[4626]: I0223 07:28:02.105110 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x4r7v" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="registry-server" probeResult="failure" output=<
Feb 23 07:28:02 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 07:28:02 crc kubenswrapper[4626]: >
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.121881 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvcgs"]
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.122195 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vvcgs" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="registry-server" containerID="cri-o://611484c96ef9cb8c09fb27d04dc43cc4dbbd053f0e0f8f07abf4bf4e770b7581" gracePeriod=2
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.662989 4626 generic.go:334] "Generic (PLEG): container finished" podID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerID="611484c96ef9cb8c09fb27d04dc43cc4dbbd053f0e0f8f07abf4bf4e770b7581" exitCode=0
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.663066 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvcgs" event={"ID":"f2aa552b-c93c-44f9-88d8-5641e54e129a","Type":"ContainerDied","Data":"611484c96ef9cb8c09fb27d04dc43cc4dbbd053f0e0f8f07abf4bf4e770b7581"}
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.802526 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvcgs"
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.972014 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-catalog-content\") pod \"f2aa552b-c93c-44f9-88d8-5641e54e129a\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") "
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.972558 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drcd\" (UniqueName: \"kubernetes.io/projected/f2aa552b-c93c-44f9-88d8-5641e54e129a-kube-api-access-7drcd\") pod \"f2aa552b-c93c-44f9-88d8-5641e54e129a\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") "
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.972783 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-utilities\") pod \"f2aa552b-c93c-44f9-88d8-5641e54e129a\" (UID: \"f2aa552b-c93c-44f9-88d8-5641e54e129a\") "
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.973814 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-utilities" (OuterVolumeSpecName: "utilities") pod "f2aa552b-c93c-44f9-88d8-5641e54e129a" (UID: "f2aa552b-c93c-44f9-88d8-5641e54e129a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.990993 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2aa552b-c93c-44f9-88d8-5641e54e129a-kube-api-access-7drcd" (OuterVolumeSpecName: "kube-api-access-7drcd") pod "f2aa552b-c93c-44f9-88d8-5641e54e129a" (UID: "f2aa552b-c93c-44f9-88d8-5641e54e129a"). InnerVolumeSpecName "kube-api-access-7drcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:28:03 crc kubenswrapper[4626]: I0223 07:28:03.998679 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2aa552b-c93c-44f9-88d8-5641e54e129a" (UID: "f2aa552b-c93c-44f9-88d8-5641e54e129a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.075466 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.075519 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7drcd\" (UniqueName: \"kubernetes.io/projected/f2aa552b-c93c-44f9-88d8-5641e54e129a-kube-api-access-7drcd\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.075536 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2aa552b-c93c-44f9-88d8-5641e54e129a-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.673849 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvcgs" event={"ID":"f2aa552b-c93c-44f9-88d8-5641e54e129a","Type":"ContainerDied","Data":"391e4bac298081b88300ff8ef10a08795e67948039fe1c1467b90b1e602907d5"}
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.673903 4626 scope.go:117] "RemoveContainer" containerID="611484c96ef9cb8c09fb27d04dc43cc4dbbd053f0e0f8f07abf4bf4e770b7581"
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.674049 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvcgs"
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.701989 4626 scope.go:117] "RemoveContainer" containerID="c35535a4d2aaefd340b99b9a4131945769469bf4239f833556e087fe25f5b2fb"
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.710661 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvcgs"]
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.735036 4626 scope.go:117] "RemoveContainer" containerID="f411333b41409b2d97029cb7ddf273cae3d05a285d92635af3c6e24ce2ada26a"
Feb 23 07:28:04 crc kubenswrapper[4626]: I0223 07:28:04.741045 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvcgs"]
Feb 23 07:28:05 crc kubenswrapper[4626]: I0223 07:28:05.992821 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" path="/var/lib/kubelet/pods/f2aa552b-c93c-44f9-88d8-5641e54e129a/volumes"
Feb 23 07:28:12 crc kubenswrapper[4626]: I0223 07:28:12.100035 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x4r7v" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="registry-server" probeResult="failure" output=<
Feb 23 07:28:12 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 07:28:12 crc kubenswrapper[4626]: >
Feb 23 07:28:21 crc kubenswrapper[4626]: I0223 07:28:21.141313 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x4r7v"
Feb 23 07:28:21 crc kubenswrapper[4626]: I0223 07:28:21.190339 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x4r7v"
Feb 23 07:28:21 crc kubenswrapper[4626]: I0223 07:28:21.645450 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x4r7v"]
Feb 23 07:28:22 crc kubenswrapper[4626]: I0223 07:28:22.848305 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x4r7v" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="registry-server" containerID="cri-o://51ddadaf61de80f400e1f054560a9197dee2cbbda6565f8090c674d4c22f1ca7" gracePeriod=2
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:23.859561 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x4r7v" event={"ID":"37327461-82a8-4ebb-b2cc-1e3e17599334","Type":"ContainerDied","Data":"51ddadaf61de80f400e1f054560a9197dee2cbbda6565f8090c674d4c22f1ca7"}
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:23.859622 4626 generic.go:334] "Generic (PLEG): container finished" podID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerID="51ddadaf61de80f400e1f054560a9197dee2cbbda6565f8090c674d4c22f1ca7" exitCode=0
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:23.859995 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x4r7v" event={"ID":"37327461-82a8-4ebb-b2cc-1e3e17599334","Type":"ContainerDied","Data":"f15bbe23a9b8c994b4763e375acfff6ca7894472382b1dc897c012d629c7c01f"}
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:23.860022 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f15bbe23a9b8c994b4763e375acfff6ca7894472382b1dc897c012d629c7c01f"
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:23.916984 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x4r7v"
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.083709 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-catalog-content\") pod \"37327461-82a8-4ebb-b2cc-1e3e17599334\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") "
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.083775 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwnm4\" (UniqueName: \"kubernetes.io/projected/37327461-82a8-4ebb-b2cc-1e3e17599334-kube-api-access-wwnm4\") pod \"37327461-82a8-4ebb-b2cc-1e3e17599334\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") "
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.083848 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-utilities\") pod \"37327461-82a8-4ebb-b2cc-1e3e17599334\" (UID: \"37327461-82a8-4ebb-b2cc-1e3e17599334\") "
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.091845 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-utilities" (OuterVolumeSpecName: "utilities") pod "37327461-82a8-4ebb-b2cc-1e3e17599334" (UID: "37327461-82a8-4ebb-b2cc-1e3e17599334"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.125812 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37327461-82a8-4ebb-b2cc-1e3e17599334-kube-api-access-wwnm4" (OuterVolumeSpecName: "kube-api-access-wwnm4") pod "37327461-82a8-4ebb-b2cc-1e3e17599334" (UID: "37327461-82a8-4ebb-b2cc-1e3e17599334"). InnerVolumeSpecName "kube-api-access-wwnm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.186461 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.186489 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwnm4\" (UniqueName: \"kubernetes.io/projected/37327461-82a8-4ebb-b2cc-1e3e17599334-kube-api-access-wwnm4\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.233240 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37327461-82a8-4ebb-b2cc-1e3e17599334" (UID: "37327461-82a8-4ebb-b2cc-1e3e17599334"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.289087 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37327461-82a8-4ebb-b2cc-1e3e17599334-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.870904 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x4r7v"
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.908668 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x4r7v"]
Feb 23 07:28:24 crc kubenswrapper[4626]: I0223 07:28:24.956608 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x4r7v"]
Feb 23 07:28:25 crc kubenswrapper[4626]: I0223 07:28:25.991483 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" path="/var/lib/kubelet/pods/37327461-82a8-4ebb-b2cc-1e3e17599334/volumes"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.551529 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"]
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556054 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="extract-content"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556077 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="extract-content"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556524 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="extract-content"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556538 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="extract-content"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556557 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="extract-utilities"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556563 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="extract-utilities"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556583 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="extract-utilities"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556591 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="extract-utilities"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556605 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556610 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556624 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="extract-content"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556630 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="extract-content"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556653 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556658 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556669 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556674 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: E0223 07:30:00.556698 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="extract-utilities"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.556703 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="extract-utilities"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.557601 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2aa552b-c93c-44f9-88d8-5641e54e129a" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.557741 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="37327461-82a8-4ebb-b2cc-1e3e17599334" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.557762 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1437afa-b67e-486f-96c1-2c18eefded03" containerName="registry-server"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.561918 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.571185 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.590266 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce8553bf-f699-41d1-b085-de32675318d6-secret-volume\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.590405 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rccq6\" (UniqueName: \"kubernetes.io/projected/ce8553bf-f699-41d1-b085-de32675318d6-kube-api-access-rccq6\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.590666 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce8553bf-f699-41d1-b085-de32675318d6-config-volume\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.599462 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.673752 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"]
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.691981 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rccq6\" (UniqueName: \"kubernetes.io/projected/ce8553bf-f699-41d1-b085-de32675318d6-kube-api-access-rccq6\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.692133 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce8553bf-f699-41d1-b085-de32675318d6-config-volume\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.692195 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce8553bf-f699-41d1-b085-de32675318d6-secret-volume\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.695533 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce8553bf-f699-41d1-b085-de32675318d6-config-volume\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.710746 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce8553bf-f699-41d1-b085-de32675318d6-secret-volume\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.712267 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rccq6\" (UniqueName: \"kubernetes.io/projected/ce8553bf-f699-41d1-b085-de32675318d6-kube-api-access-rccq6\") pod \"collect-profiles-29530530-8f76d\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:00 crc kubenswrapper[4626]: I0223 07:30:00.886910 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:01 crc kubenswrapper[4626]: I0223 07:30:01.625087 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"]
Feb 23 07:30:02 crc kubenswrapper[4626]: I0223 07:30:02.587352 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d" event={"ID":"ce8553bf-f699-41d1-b085-de32675318d6","Type":"ContainerDied","Data":"2f70721d91afcc09a7dcdba6a750563c8a5d2edeee33e28570d0815a8959fec6"}
Feb 23 07:30:02 crc kubenswrapper[4626]: I0223 07:30:02.587732 4626 generic.go:334] "Generic (PLEG): container finished" podID="ce8553bf-f699-41d1-b085-de32675318d6" containerID="2f70721d91afcc09a7dcdba6a750563c8a5d2edeee33e28570d0815a8959fec6" exitCode=0
Feb 23 07:30:02 crc kubenswrapper[4626]: I0223 07:30:02.587862 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d" event={"ID":"ce8553bf-f699-41d1-b085-de32675318d6","Type":"ContainerStarted","Data":"fab6060675d63c8b53ac93867a28840baff66638c599507428d82e23d5ef7ba2"}
Feb 23 07:30:03 crc kubenswrapper[4626]: I0223 07:30:03.942817 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.042653 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce8553bf-f699-41d1-b085-de32675318d6-config-volume\") pod \"ce8553bf-f699-41d1-b085-de32675318d6\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") "
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.042735 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce8553bf-f699-41d1-b085-de32675318d6-secret-volume\") pod \"ce8553bf-f699-41d1-b085-de32675318d6\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") "
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.042790 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rccq6\" (UniqueName: \"kubernetes.io/projected/ce8553bf-f699-41d1-b085-de32675318d6-kube-api-access-rccq6\") pod \"ce8553bf-f699-41d1-b085-de32675318d6\" (UID: \"ce8553bf-f699-41d1-b085-de32675318d6\") "
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.045083 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8553bf-f699-41d1-b085-de32675318d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "ce8553bf-f699-41d1-b085-de32675318d6" (UID: "ce8553bf-f699-41d1-b085-de32675318d6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.052704 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8553bf-f699-41d1-b085-de32675318d6-kube-api-access-rccq6" (OuterVolumeSpecName: "kube-api-access-rccq6") pod "ce8553bf-f699-41d1-b085-de32675318d6" (UID: "ce8553bf-f699-41d1-b085-de32675318d6"). InnerVolumeSpecName "kube-api-access-rccq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.057582 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce8553bf-f699-41d1-b085-de32675318d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ce8553bf-f699-41d1-b085-de32675318d6" (UID: "ce8553bf-f699-41d1-b085-de32675318d6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.145907 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce8553bf-f699-41d1-b085-de32675318d6-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.145944 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce8553bf-f699-41d1-b085-de32675318d6-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.145954 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rccq6\" (UniqueName: \"kubernetes.io/projected/ce8553bf-f699-41d1-b085-de32675318d6-kube-api-access-rccq6\") on node \"crc\" DevicePath \"\""
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.602484 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d" event={"ID":"ce8553bf-f699-41d1-b085-de32675318d6","Type":"ContainerDied","Data":"fab6060675d63c8b53ac93867a28840baff66638c599507428d82e23d5ef7ba2"}
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.602857 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"
Feb 23 07:30:04 crc kubenswrapper[4626]: I0223 07:30:04.602593 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab6060675d63c8b53ac93867a28840baff66638c599507428d82e23d5ef7ba2"
Feb 23 07:30:05 crc kubenswrapper[4626]: I0223 07:30:05.025480 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf"]
Feb 23 07:30:05 crc kubenswrapper[4626]: I0223 07:30:05.033238 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-lhsnf"]
Feb 23 07:30:05 crc kubenswrapper[4626]: I0223 07:30:05.992186 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2034397-268c-4190-8b7a-5c65db1eebb9" path="/var/lib/kubelet/pods/f2034397-268c-4190-8b7a-5c65db1eebb9/volumes"
Feb 23 07:30:21 crc kubenswrapper[4626]: I0223 07:30:21.575545 4626 scope.go:117] "RemoveContainer" containerID="5c567ddfbada4f38d8ed50afba763158b44c39b04642fcffd752121387fb4c06"
Feb 23 07:30:25 crc kubenswrapper[4626]: I0223 07:30:25.685380 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:30:25 crc kubenswrapper[4626]: I0223 07:30:25.686160 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:30:55 crc kubenswrapper[4626]: I0223 07:30:55.685836 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:30:55 crc kubenswrapper[4626]: I0223 07:30:55.686437 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:31:25 crc kubenswrapper[4626]: I0223 07:31:25.684971 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:31:25 crc kubenswrapper[4626]: I0223 07:31:25.685404 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:31:25 crc kubenswrapper[4626]: I0223 07:31:25.686687 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 07:31:25 crc kubenswrapper[4626]: I0223 07:31:25.688649 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 07:31:25 crc kubenswrapper[4626]: I0223 07:31:25.688966 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" gracePeriod=600
Feb 23 07:31:25 crc kubenswrapper[4626]: E0223 07:31:25.822395 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:31:26 crc kubenswrapper[4626]: I0223 07:31:26.199103 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" exitCode=0
Feb 23 07:31:26 crc kubenswrapper[4626]: I0223 07:31:26.199434 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac"}
Feb 23 07:31:26 crc kubenswrapper[4626]: I0223 07:31:26.201298 4626 scope.go:117] "RemoveContainer" containerID="85ab38e54b615607291465c65e9cd58e607ced847ce6f1429e25bf1ae5a33a06"
Feb 23 07:31:26 crc kubenswrapper[4626]: I0223 07:31:26.201410 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac"
Feb 23 07:31:26 crc kubenswrapper[4626]: E0223 07:31:26.201853 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:31:38 crc kubenswrapper[4626]: I0223 07:31:38.982543 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac"
Feb 23 07:31:38 crc kubenswrapper[4626]: E0223 07:31:38.983256 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:31:51 crc kubenswrapper[4626]: I0223 07:31:51.982436 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac"
Feb 23 07:31:51 crc kubenswrapper[4626]: E0223 07:31:51.983219 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb
23 07:32:03 crc kubenswrapper[4626]: I0223 07:32:03.982703 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:32:03 crc kubenswrapper[4626]: E0223 07:32:03.983283 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:32:15 crc kubenswrapper[4626]: I0223 07:32:15.996078 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:32:15 crc kubenswrapper[4626]: E0223 07:32:15.999394 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:32:23 crc kubenswrapper[4626]: E0223 07:32:23.336536 4626 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.58:46712->192.168.26.58:46805: read tcp 192.168.26.58:46712->192.168.26.58:46805: read: connection reset by peer Feb 23 07:32:29 crc kubenswrapper[4626]: I0223 07:32:29.982047 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:32:29 crc kubenswrapper[4626]: E0223 07:32:29.982606 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:32:43 crc kubenswrapper[4626]: I0223 07:32:43.982217 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:32:43 crc kubenswrapper[4626]: E0223 07:32:43.982769 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:32:57 crc kubenswrapper[4626]: I0223 07:32:57.986921 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:32:57 crc kubenswrapper[4626]: E0223 07:32:57.987747 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:33:10 crc kubenswrapper[4626]: I0223 07:33:10.982339 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:33:10 crc kubenswrapper[4626]: E0223 07:33:10.983103 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:33:25 crc kubenswrapper[4626]: I0223 07:33:25.982311 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:33:25 crc kubenswrapper[4626]: E0223 07:33:25.982881 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:33:36 crc kubenswrapper[4626]: I0223 07:33:36.981731 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:33:36 crc kubenswrapper[4626]: E0223 07:33:36.982371 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:33:49 crc kubenswrapper[4626]: I0223 07:33:49.981454 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:33:49 crc kubenswrapper[4626]: E0223 07:33:49.982100 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:34:04 crc kubenswrapper[4626]: I0223 07:34:04.982549 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:34:04 crc kubenswrapper[4626]: E0223 07:34:04.983848 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:34:16 crc kubenswrapper[4626]: I0223 07:34:16.982864 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:34:16 crc kubenswrapper[4626]: E0223 07:34:16.983560 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:34:21 crc kubenswrapper[4626]: I0223 07:34:21.768372 4626 scope.go:117] "RemoveContainer" containerID="607ad686d6047fc0343f08d88207574b6551d592f4ca1d4e12ab1b776e369dca" Feb 23 07:34:21 crc kubenswrapper[4626]: I0223 07:34:21.826567 4626 scope.go:117] "RemoveContainer" 
containerID="7e02d6d960b6eeeac4320d504cc4d2b6a5d7df5c5f18c97b7687a978fb88d4a4" Feb 23 07:34:21 crc kubenswrapper[4626]: I0223 07:34:21.852770 4626 scope.go:117] "RemoveContainer" containerID="51ddadaf61de80f400e1f054560a9197dee2cbbda6565f8090c674d4c22f1ca7" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.244192 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5w4r8"] Feb 23 07:34:23 crc kubenswrapper[4626]: E0223 07:34:23.250338 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8553bf-f699-41d1-b085-de32675318d6" containerName="collect-profiles" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.250378 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8553bf-f699-41d1-b085-de32675318d6" containerName="collect-profiles" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.251539 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8553bf-f699-41d1-b085-de32675318d6" containerName="collect-profiles" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.256564 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.346205 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5w4r8"] Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.369529 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhjr2\" (UniqueName: \"kubernetes.io/projected/2ef3d1d4-1968-4892-8b32-36782f769992-kube-api-access-jhjr2\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.369873 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-utilities\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.369936 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-catalog-content\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.472597 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-utilities\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.472675 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-catalog-content\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.472844 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhjr2\" (UniqueName: \"kubernetes.io/projected/2ef3d1d4-1968-4892-8b32-36782f769992-kube-api-access-jhjr2\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.476594 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-utilities\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.476920 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-catalog-content\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.498711 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhjr2\" (UniqueName: \"kubernetes.io/projected/2ef3d1d4-1968-4892-8b32-36782f769992-kube-api-access-jhjr2\") pod \"certified-operators-5w4r8\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:23 crc kubenswrapper[4626]: I0223 07:34:23.581225 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:24 crc kubenswrapper[4626]: I0223 07:34:24.284485 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5w4r8"] Feb 23 07:34:24 crc kubenswrapper[4626]: I0223 07:34:24.476091 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerStarted","Data":"5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30"} Feb 23 07:34:24 crc kubenswrapper[4626]: I0223 07:34:24.476128 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerStarted","Data":"a33f2fc236e64ccf5b5f18bd445ac45c1d850388842f8ec67a41ba1aad8f16f9"} Feb 23 07:34:25 crc kubenswrapper[4626]: I0223 07:34:25.483170 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerDied","Data":"5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30"} Feb 23 07:34:25 crc kubenswrapper[4626]: I0223 07:34:25.483992 4626 generic.go:334] "Generic (PLEG): container finished" podID="2ef3d1d4-1968-4892-8b32-36782f769992" containerID="5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30" exitCode=0 Feb 23 07:34:25 crc kubenswrapper[4626]: I0223 07:34:25.490164 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:34:26 crc kubenswrapper[4626]: I0223 07:34:26.493135 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerStarted","Data":"4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a"} Feb 23 07:34:27 crc 
kubenswrapper[4626]: I0223 07:34:27.513754 4626 generic.go:334] "Generic (PLEG): container finished" podID="2ef3d1d4-1968-4892-8b32-36782f769992" containerID="4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a" exitCode=0 Feb 23 07:34:27 crc kubenswrapper[4626]: I0223 07:34:27.513798 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerDied","Data":"4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a"} Feb 23 07:34:28 crc kubenswrapper[4626]: I0223 07:34:28.521263 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerStarted","Data":"45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d"} Feb 23 07:34:28 crc kubenswrapper[4626]: I0223 07:34:28.537234 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5w4r8" podStartSLOduration=3.013555225 podStartE2EDuration="5.536041573s" podCreationTimestamp="2026-02-23 07:34:23 +0000 UTC" firstStartedPulling="2026-02-23 07:34:25.484650334 +0000 UTC m=+3217.823979599" lastFinishedPulling="2026-02-23 07:34:28.007136691 +0000 UTC m=+3220.346465947" observedRunningTime="2026-02-23 07:34:28.534309467 +0000 UTC m=+3220.873638733" watchObservedRunningTime="2026-02-23 07:34:28.536041573 +0000 UTC m=+3220.875370839" Feb 23 07:34:28 crc kubenswrapper[4626]: I0223 07:34:28.982200 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:34:28 crc kubenswrapper[4626]: E0223 07:34:28.982633 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:34:33 crc kubenswrapper[4626]: I0223 07:34:33.581977 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:33 crc kubenswrapper[4626]: I0223 07:34:33.582656 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:34 crc kubenswrapper[4626]: I0223 07:34:34.620836 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5w4r8" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="registry-server" probeResult="failure" output=< Feb 23 07:34:34 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:34:34 crc kubenswrapper[4626]: > Feb 23 07:34:43 crc kubenswrapper[4626]: I0223 07:34:43.626887 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:43 crc kubenswrapper[4626]: I0223 07:34:43.666652 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:43 crc kubenswrapper[4626]: I0223 07:34:43.697442 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5w4r8"] Feb 23 07:34:43 crc kubenswrapper[4626]: I0223 07:34:43.982346 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:34:43 crc kubenswrapper[4626]: E0223 07:34:43.982582 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:34:45 crc kubenswrapper[4626]: I0223 07:34:45.631235 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5w4r8" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="registry-server" containerID="cri-o://45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d" gracePeriod=2 Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.368628 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.546962 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhjr2\" (UniqueName: \"kubernetes.io/projected/2ef3d1d4-1968-4892-8b32-36782f769992-kube-api-access-jhjr2\") pod \"2ef3d1d4-1968-4892-8b32-36782f769992\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.547265 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-catalog-content\") pod \"2ef3d1d4-1968-4892-8b32-36782f769992\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.547289 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-utilities\") pod \"2ef3d1d4-1968-4892-8b32-36782f769992\" (UID: \"2ef3d1d4-1968-4892-8b32-36782f769992\") " Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.549671 4626 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-utilities" (OuterVolumeSpecName: "utilities") pod "2ef3d1d4-1968-4892-8b32-36782f769992" (UID: "2ef3d1d4-1968-4892-8b32-36782f769992"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.566585 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef3d1d4-1968-4892-8b32-36782f769992-kube-api-access-jhjr2" (OuterVolumeSpecName: "kube-api-access-jhjr2") pod "2ef3d1d4-1968-4892-8b32-36782f769992" (UID: "2ef3d1d4-1968-4892-8b32-36782f769992"). InnerVolumeSpecName "kube-api-access-jhjr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.639264 4626 generic.go:334] "Generic (PLEG): container finished" podID="2ef3d1d4-1968-4892-8b32-36782f769992" containerID="45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d" exitCode=0 Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.639836 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerDied","Data":"45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d"} Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.639953 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w4r8" event={"ID":"2ef3d1d4-1968-4892-8b32-36782f769992","Type":"ContainerDied","Data":"a33f2fc236e64ccf5b5f18bd445ac45c1d850388842f8ec67a41ba1aad8f16f9"} Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.640029 4626 scope.go:117] "RemoveContainer" containerID="45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.640231 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5w4r8" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.647419 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ef3d1d4-1968-4892-8b32-36782f769992" (UID: "2ef3d1d4-1968-4892-8b32-36782f769992"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.655004 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhjr2\" (UniqueName: \"kubernetes.io/projected/2ef3d1d4-1968-4892-8b32-36782f769992-kube-api-access-jhjr2\") on node \"crc\" DevicePath \"\"" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.655054 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.655064 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ef3d1d4-1968-4892-8b32-36782f769992-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.678297 4626 scope.go:117] "RemoveContainer" containerID="4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.696214 4626 scope.go:117] "RemoveContainer" containerID="5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.732439 4626 scope.go:117] "RemoveContainer" containerID="45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d" Feb 23 07:34:46 crc kubenswrapper[4626]: E0223 07:34:46.734220 4626 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d\": container with ID starting with 45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d not found: ID does not exist" containerID="45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.734969 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d"} err="failed to get container status \"45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d\": rpc error: code = NotFound desc = could not find container \"45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d\": container with ID starting with 45ed02641aa924de973e933a519659506e7b77461276eeded5182688632a070d not found: ID does not exist" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.735005 4626 scope.go:117] "RemoveContainer" containerID="4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a" Feb 23 07:34:46 crc kubenswrapper[4626]: E0223 07:34:46.737772 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a\": container with ID starting with 4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a not found: ID does not exist" containerID="4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.737830 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a"} err="failed to get container status \"4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a\": rpc error: code = NotFound desc = could not find container 
\"4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a\": container with ID starting with 4d8c03f3c6c6495c052d642cab15764222004be828fa727cde31a60fc8e4e61a not found: ID does not exist" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.737857 4626 scope.go:117] "RemoveContainer" containerID="5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30" Feb 23 07:34:46 crc kubenswrapper[4626]: E0223 07:34:46.738225 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30\": container with ID starting with 5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30 not found: ID does not exist" containerID="5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.738253 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30"} err="failed to get container status \"5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30\": rpc error: code = NotFound desc = could not find container \"5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30\": container with ID starting with 5f229f34cdc32f8cf21af4a3c869567ace66992eee4b9207bb769d5425279a30 not found: ID does not exist" Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.975373 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5w4r8"] Feb 23 07:34:46 crc kubenswrapper[4626]: I0223 07:34:46.985294 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5w4r8"] Feb 23 07:34:47 crc kubenswrapper[4626]: I0223 07:34:47.990462 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" 
path="/var/lib/kubelet/pods/2ef3d1d4-1968-4892-8b32-36782f769992/volumes" Feb 23 07:34:57 crc kubenswrapper[4626]: I0223 07:34:57.988376 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:34:57 crc kubenswrapper[4626]: E0223 07:34:57.989540 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:35:09 crc kubenswrapper[4626]: I0223 07:35:09.986088 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:35:09 crc kubenswrapper[4626]: E0223 07:35:09.986955 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:35:22 crc kubenswrapper[4626]: I0223 07:35:22.982143 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:35:22 crc kubenswrapper[4626]: E0223 07:35:22.982814 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:35:36 crc kubenswrapper[4626]: I0223 07:35:36.982116 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:35:36 crc kubenswrapper[4626]: E0223 07:35:36.982648 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:35:49 crc kubenswrapper[4626]: I0223 07:35:49.982664 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:35:49 crc kubenswrapper[4626]: E0223 07:35:49.983230 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:36:03 crc kubenswrapper[4626]: I0223 07:36:03.982472 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:36:03 crc kubenswrapper[4626]: E0223 07:36:03.983162 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:36:18 crc kubenswrapper[4626]: I0223 07:36:18.981471 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:36:18 crc kubenswrapper[4626]: E0223 07:36:18.982053 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:36:30 crc kubenswrapper[4626]: I0223 07:36:30.981949 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:36:31 crc kubenswrapper[4626]: I0223 07:36:31.355220 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"5fa0f336bfdf7b911090bf6cb811863ef59de648375d06011c6d727a9ef66f0c"} Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.587677 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4lzzj"] Feb 23 07:38:21 crc kubenswrapper[4626]: E0223 07:38:21.590554 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="extract-utilities" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.590583 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="extract-utilities" Feb 23 07:38:21 crc 
kubenswrapper[4626]: E0223 07:38:21.590611 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="extract-content" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.590618 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="extract-content" Feb 23 07:38:21 crc kubenswrapper[4626]: E0223 07:38:21.590639 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="registry-server" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.590644 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="registry-server" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.591741 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef3d1d4-1968-4892-8b32-36782f769992" containerName="registry-server" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.595569 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.601024 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-catalog-content\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.601353 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-utilities\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.601397 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sntzb\" (UniqueName: \"kubernetes.io/projected/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-kube-api-access-sntzb\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.702441 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-utilities\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.702489 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntzb\" (UniqueName: \"kubernetes.io/projected/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-kube-api-access-sntzb\") pod 
\"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.702576 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-catalog-content\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.704619 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-utilities\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.705480 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-catalog-content\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.785076 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lzzj"] Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.801765 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntzb\" (UniqueName: \"kubernetes.io/projected/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-kube-api-access-sntzb\") pod \"redhat-marketplace-4lzzj\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:21 crc kubenswrapper[4626]: I0223 07:38:21.919023 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:22 crc kubenswrapper[4626]: I0223 07:38:22.762230 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lzzj"] Feb 23 07:38:23 crc kubenswrapper[4626]: I0223 07:38:23.228192 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lzzj" event={"ID":"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2","Type":"ContainerDied","Data":"bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d"} Feb 23 07:38:23 crc kubenswrapper[4626]: I0223 07:38:23.228443 4626 generic.go:334] "Generic (PLEG): container finished" podID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerID="bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d" exitCode=0 Feb 23 07:38:23 crc kubenswrapper[4626]: I0223 07:38:23.229111 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lzzj" event={"ID":"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2","Type":"ContainerStarted","Data":"8f01ad820a11acfb62303e5e14041ddb5235b805622b5b35fe415201985c420f"} Feb 23 07:38:24 crc kubenswrapper[4626]: I0223 07:38:24.240749 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lzzj" event={"ID":"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2","Type":"ContainerStarted","Data":"c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564"} Feb 23 07:38:25 crc kubenswrapper[4626]: I0223 07:38:25.250646 4626 generic.go:334] "Generic (PLEG): container finished" podID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerID="c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564" exitCode=0 Feb 23 07:38:25 crc kubenswrapper[4626]: I0223 07:38:25.250753 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lzzj" 
event={"ID":"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2","Type":"ContainerDied","Data":"c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564"} Feb 23 07:38:26 crc kubenswrapper[4626]: I0223 07:38:26.263240 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lzzj" event={"ID":"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2","Type":"ContainerStarted","Data":"563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b"} Feb 23 07:38:26 crc kubenswrapper[4626]: I0223 07:38:26.289531 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4lzzj" podStartSLOduration=2.699758099 podStartE2EDuration="5.288210608s" podCreationTimestamp="2026-02-23 07:38:21 +0000 UTC" firstStartedPulling="2026-02-23 07:38:23.231809871 +0000 UTC m=+3455.571139137" lastFinishedPulling="2026-02-23 07:38:25.82026238 +0000 UTC m=+3458.159591646" observedRunningTime="2026-02-23 07:38:26.278893793 +0000 UTC m=+3458.618223058" watchObservedRunningTime="2026-02-23 07:38:26.288210608 +0000 UTC m=+3458.627539874" Feb 23 07:38:31 crc kubenswrapper[4626]: I0223 07:38:31.920120 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:31 crc kubenswrapper[4626]: I0223 07:38:31.920703 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:31 crc kubenswrapper[4626]: I0223 07:38:31.968316 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:32 crc kubenswrapper[4626]: I0223 07:38:32.359545 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:32 crc kubenswrapper[4626]: I0223 07:38:32.437175 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4lzzj"] Feb 23 07:38:34 crc kubenswrapper[4626]: I0223 07:38:34.331907 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4lzzj" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="registry-server" containerID="cri-o://563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b" gracePeriod=2 Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.169234 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.184790 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-catalog-content\") pod \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.184919 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-utilities\") pod \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.185005 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sntzb\" (UniqueName: \"kubernetes.io/projected/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-kube-api-access-sntzb\") pod \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\" (UID: \"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2\") " Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.187107 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-utilities" (OuterVolumeSpecName: "utilities") pod "3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" (UID: 
"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.203353 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-kube-api-access-sntzb" (OuterVolumeSpecName: "kube-api-access-sntzb") pod "3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" (UID: "3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2"). InnerVolumeSpecName "kube-api-access-sntzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.215350 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" (UID: "3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.288686 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.288714 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.288725 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sntzb\" (UniqueName: \"kubernetes.io/projected/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2-kube-api-access-sntzb\") on node \"crc\" DevicePath \"\"" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.339607 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerID="563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b" exitCode=0 Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.339652 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lzzj" event={"ID":"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2","Type":"ContainerDied","Data":"563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b"} Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.339677 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4lzzj" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.339702 4626 scope.go:117] "RemoveContainer" containerID="563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.339684 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4lzzj" event={"ID":"3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2","Type":"ContainerDied","Data":"8f01ad820a11acfb62303e5e14041ddb5235b805622b5b35fe415201985c420f"} Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.374847 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lzzj"] Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.378123 4626 scope.go:117] "RemoveContainer" containerID="c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.381850 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4lzzj"] Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.403028 4626 scope.go:117] "RemoveContainer" containerID="bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.441691 4626 scope.go:117] "RemoveContainer" 
containerID="563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b" Feb 23 07:38:35 crc kubenswrapper[4626]: E0223 07:38:35.445318 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b\": container with ID starting with 563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b not found: ID does not exist" containerID="563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.445358 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b"} err="failed to get container status \"563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b\": rpc error: code = NotFound desc = could not find container \"563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b\": container with ID starting with 563aafb5eec8892992c32a58e3e6d6b75912c8305cccf432d5d9dcbdcbf5a74b not found: ID does not exist" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.445380 4626 scope.go:117] "RemoveContainer" containerID="c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564" Feb 23 07:38:35 crc kubenswrapper[4626]: E0223 07:38:35.445849 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564\": container with ID starting with c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564 not found: ID does not exist" containerID="c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.445892 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564"} err="failed to get container status \"c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564\": rpc error: code = NotFound desc = could not find container \"c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564\": container with ID starting with c3066c6aeb48113f36f8fd5e47f940f7e178b6b1ffca485b199fe72ffc2e5564 not found: ID does not exist" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.445907 4626 scope.go:117] "RemoveContainer" containerID="bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d" Feb 23 07:38:35 crc kubenswrapper[4626]: E0223 07:38:35.446227 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d\": container with ID starting with bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d not found: ID does not exist" containerID="bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.446270 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d"} err="failed to get container status \"bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d\": rpc error: code = NotFound desc = could not find container \"bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d\": container with ID starting with bf307a628b91d3480ea10e288abc637bad15cc40db206d6c2098e6bf4769a22d not found: ID does not exist" Feb 23 07:38:35 crc kubenswrapper[4626]: I0223 07:38:35.993112 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" path="/var/lib/kubelet/pods/3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2/volumes" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 
07:38:43.375662 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xk9s2"] Feb 23 07:38:43 crc kubenswrapper[4626]: E0223 07:38:43.377458 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="extract-content" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.377485 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="extract-content" Feb 23 07:38:43 crc kubenswrapper[4626]: E0223 07:38:43.377544 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="extract-utilities" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.377551 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="extract-utilities" Feb 23 07:38:43 crc kubenswrapper[4626]: E0223 07:38:43.377582 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="registry-server" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.377589 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="registry-server" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.378707 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef7eb1b-e7b5-4be5-bd9b-9d86d76736a2" containerName="registry-server" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.382793 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.397580 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xk9s2"] Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.453376 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-utilities\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.453688 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-catalog-content\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.453819 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw9rl\" (UniqueName: \"kubernetes.io/projected/ab3e6766-b00f-4aaa-bcc3-64e971114da9-kube-api-access-lw9rl\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.555833 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw9rl\" (UniqueName: \"kubernetes.io/projected/ab3e6766-b00f-4aaa-bcc3-64e971114da9-kube-api-access-lw9rl\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.556287 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-utilities\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.556475 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-catalog-content\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.557909 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-utilities\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.557938 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-catalog-content\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.576820 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw9rl\" (UniqueName: \"kubernetes.io/projected/ab3e6766-b00f-4aaa-bcc3-64e971114da9-kube-api-access-lw9rl\") pod \"redhat-operators-xk9s2\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:43 crc kubenswrapper[4626]: I0223 07:38:43.702796 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:44 crc kubenswrapper[4626]: I0223 07:38:44.203530 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xk9s2"] Feb 23 07:38:44 crc kubenswrapper[4626]: I0223 07:38:44.411921 4626 generic.go:334] "Generic (PLEG): container finished" podID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerID="7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8" exitCode=0 Feb 23 07:38:44 crc kubenswrapper[4626]: I0223 07:38:44.411957 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk9s2" event={"ID":"ab3e6766-b00f-4aaa-bcc3-64e971114da9","Type":"ContainerDied","Data":"7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8"} Feb 23 07:38:44 crc kubenswrapper[4626]: I0223 07:38:44.411979 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk9s2" event={"ID":"ab3e6766-b00f-4aaa-bcc3-64e971114da9","Type":"ContainerStarted","Data":"97693914fbdf0fff91a7f82289bd1c75f96a7187a6bf314d4633b4cbed1488b8"} Feb 23 07:38:45 crc kubenswrapper[4626]: I0223 07:38:45.446243 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk9s2" event={"ID":"ab3e6766-b00f-4aaa-bcc3-64e971114da9","Type":"ContainerStarted","Data":"cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4"} Feb 23 07:38:48 crc kubenswrapper[4626]: I0223 07:38:48.467266 4626 generic.go:334] "Generic (PLEG): container finished" podID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerID="cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4" exitCode=0 Feb 23 07:38:48 crc kubenswrapper[4626]: I0223 07:38:48.467361 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk9s2" 
event={"ID":"ab3e6766-b00f-4aaa-bcc3-64e971114da9","Type":"ContainerDied","Data":"cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4"} Feb 23 07:38:49 crc kubenswrapper[4626]: I0223 07:38:49.476223 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk9s2" event={"ID":"ab3e6766-b00f-4aaa-bcc3-64e971114da9","Type":"ContainerStarted","Data":"7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b"} Feb 23 07:38:49 crc kubenswrapper[4626]: I0223 07:38:49.494687 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xk9s2" podStartSLOduration=1.8692627800000001 podStartE2EDuration="6.492596263s" podCreationTimestamp="2026-02-23 07:38:43 +0000 UTC" firstStartedPulling="2026-02-23 07:38:44.413878781 +0000 UTC m=+3476.753208047" lastFinishedPulling="2026-02-23 07:38:49.037212264 +0000 UTC m=+3481.376541530" observedRunningTime="2026-02-23 07:38:49.488459013 +0000 UTC m=+3481.827788280" watchObservedRunningTime="2026-02-23 07:38:49.492596263 +0000 UTC m=+3481.831925530" Feb 23 07:38:53 crc kubenswrapper[4626]: I0223 07:38:53.713790 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:53 crc kubenswrapper[4626]: I0223 07:38:53.714205 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:38:54 crc kubenswrapper[4626]: I0223 07:38:54.749982 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xk9s2" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="registry-server" probeResult="failure" output=< Feb 23 07:38:54 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:38:54 crc kubenswrapper[4626]: > Feb 23 07:38:55 crc kubenswrapper[4626]: I0223 07:38:55.685221 4626 patch_prober.go:28] interesting 
pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:38:55 crc kubenswrapper[4626]: I0223 07:38:55.686849 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:39:04 crc kubenswrapper[4626]: I0223 07:39:04.739677 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xk9s2" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="registry-server" probeResult="failure" output=< Feb 23 07:39:04 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:39:04 crc kubenswrapper[4626]: > Feb 23 07:39:14 crc kubenswrapper[4626]: I0223 07:39:14.739964 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xk9s2" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="registry-server" probeResult="failure" output=< Feb 23 07:39:14 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 07:39:14 crc kubenswrapper[4626]: > Feb 23 07:39:17 crc kubenswrapper[4626]: E0223 07:39:17.805816 4626 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.58:37074->192.168.26.58:46805: write tcp 192.168.26.58:37074->192.168.26.58:46805: write: connection reset by peer Feb 23 07:39:23 crc kubenswrapper[4626]: I0223 07:39:23.747306 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:39:23 crc kubenswrapper[4626]: I0223 
07:39:23.791035 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:39:24 crc kubenswrapper[4626]: I0223 07:39:24.073989 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xk9s2"] Feb 23 07:39:25 crc kubenswrapper[4626]: I0223 07:39:25.685066 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:39:25 crc kubenswrapper[4626]: I0223 07:39:25.685696 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:39:25 crc kubenswrapper[4626]: I0223 07:39:25.729558 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xk9s2" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="registry-server" containerID="cri-o://7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b" gracePeriod=2 Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.496140 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.608038 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw9rl\" (UniqueName: \"kubernetes.io/projected/ab3e6766-b00f-4aaa-bcc3-64e971114da9-kube-api-access-lw9rl\") pod \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.608125 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-catalog-content\") pod \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.608260 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-utilities\") pod \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\" (UID: \"ab3e6766-b00f-4aaa-bcc3-64e971114da9\") " Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.610467 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-utilities" (OuterVolumeSpecName: "utilities") pod "ab3e6766-b00f-4aaa-bcc3-64e971114da9" (UID: "ab3e6766-b00f-4aaa-bcc3-64e971114da9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.620200 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3e6766-b00f-4aaa-bcc3-64e971114da9-kube-api-access-lw9rl" (OuterVolumeSpecName: "kube-api-access-lw9rl") pod "ab3e6766-b00f-4aaa-bcc3-64e971114da9" (UID: "ab3e6766-b00f-4aaa-bcc3-64e971114da9"). InnerVolumeSpecName "kube-api-access-lw9rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.710800 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.710827 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw9rl\" (UniqueName: \"kubernetes.io/projected/ab3e6766-b00f-4aaa-bcc3-64e971114da9-kube-api-access-lw9rl\") on node \"crc\" DevicePath \"\"" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.729225 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab3e6766-b00f-4aaa-bcc3-64e971114da9" (UID: "ab3e6766-b00f-4aaa-bcc3-64e971114da9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.737336 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xk9s2" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.737414 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk9s2" event={"ID":"ab3e6766-b00f-4aaa-bcc3-64e971114da9","Type":"ContainerDied","Data":"7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b"} Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.737478 4626 scope.go:117] "RemoveContainer" containerID="7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.737903 4626 generic.go:334] "Generic (PLEG): container finished" podID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerID="7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b" exitCode=0 Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.738125 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xk9s2" event={"ID":"ab3e6766-b00f-4aaa-bcc3-64e971114da9","Type":"ContainerDied","Data":"97693914fbdf0fff91a7f82289bd1c75f96a7187a6bf314d4633b4cbed1488b8"} Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.774779 4626 scope.go:117] "RemoveContainer" containerID="cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.775459 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xk9s2"] Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.796399 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xk9s2"] Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.812716 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab3e6766-b00f-4aaa-bcc3-64e971114da9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.841880 
4626 scope.go:117] "RemoveContainer" containerID="7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.875223 4626 scope.go:117] "RemoveContainer" containerID="7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b" Feb 23 07:39:26 crc kubenswrapper[4626]: E0223 07:39:26.876924 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b\": container with ID starting with 7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b not found: ID does not exist" containerID="7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.876961 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b"} err="failed to get container status \"7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b\": rpc error: code = NotFound desc = could not find container \"7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b\": container with ID starting with 7027f9c01873f0b45720a3b0fee58cce21825d02cea72c743476fa209d03f64b not found: ID does not exist" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.876982 4626 scope.go:117] "RemoveContainer" containerID="cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4" Feb 23 07:39:26 crc kubenswrapper[4626]: E0223 07:39:26.877289 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4\": container with ID starting with cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4 not found: ID does not exist" containerID="cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4" 
Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.877317 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4"} err="failed to get container status \"cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4\": rpc error: code = NotFound desc = could not find container \"cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4\": container with ID starting with cc6a2524220584d8011ee8aabd3370abebe5b8bb3ed7751e8a9e2dd78e6c6cb4 not found: ID does not exist" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.877332 4626 scope.go:117] "RemoveContainer" containerID="7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8" Feb 23 07:39:26 crc kubenswrapper[4626]: E0223 07:39:26.878622 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8\": container with ID starting with 7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8 not found: ID does not exist" containerID="7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8" Feb 23 07:39:26 crc kubenswrapper[4626]: I0223 07:39:26.878816 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8"} err="failed to get container status \"7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8\": rpc error: code = NotFound desc = could not find container \"7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8\": container with ID starting with 7d87c52c54c68b9f1feec5582065b294ad6cbd040f429428095bb1896c301bc8 not found: ID does not exist" Feb 23 07:39:27 crc kubenswrapper[4626]: I0223 07:39:27.993601 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" path="/var/lib/kubelet/pods/ab3e6766-b00f-4aaa-bcc3-64e971114da9/volumes" Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.687280 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.689244 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.690800 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.692925 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5fa0f336bfdf7b911090bf6cb811863ef59de648375d06011c6d727a9ef66f0c"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.693018 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://5fa0f336bfdf7b911090bf6cb811863ef59de648375d06011c6d727a9ef66f0c" gracePeriod=600 Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.990671 4626 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"5fa0f336bfdf7b911090bf6cb811863ef59de648375d06011c6d727a9ef66f0c"} Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.990949 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="5fa0f336bfdf7b911090bf6cb811863ef59de648375d06011c6d727a9ef66f0c" exitCode=0 Feb 23 07:39:55 crc kubenswrapper[4626]: I0223 07:39:55.991746 4626 scope.go:117] "RemoveContainer" containerID="862fad463c645c501a8bbe65b7dbc0d893dd12d5516bd78de6697864ca80cdac" Feb 23 07:39:57 crc kubenswrapper[4626]: I0223 07:39:56.999741 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"} Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.054305 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-57b9s"] Feb 23 07:40:56 crc kubenswrapper[4626]: E0223 07:40:56.056405 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="extract-content" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.056959 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="extract-content" Feb 23 07:40:56 crc kubenswrapper[4626]: E0223 07:40:56.056986 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="registry-server" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.056994 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="registry-server" Feb 23 07:40:56 crc 
kubenswrapper[4626]: E0223 07:40:56.057004 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="extract-utilities" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.057010 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="extract-utilities" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.057880 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3e6766-b00f-4aaa-bcc3-64e971114da9" containerName="registry-server" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.060899 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.131586 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-catalog-content\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.131872 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4jh\" (UniqueName: \"kubernetes.io/projected/96f88aa6-60c5-440e-bb86-26a0b274ed07-kube-api-access-xx4jh\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.132075 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-utilities\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " 
pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.176414 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57b9s"] Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.233949 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4jh\" (UniqueName: \"kubernetes.io/projected/96f88aa6-60c5-440e-bb86-26a0b274ed07-kube-api-access-xx4jh\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.234061 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-utilities\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.234275 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-catalog-content\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.237901 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-utilities\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.238424 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-catalog-content\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.266675 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4jh\" (UniqueName: \"kubernetes.io/projected/96f88aa6-60c5-440e-bb86-26a0b274ed07-kube-api-access-xx4jh\") pod \"community-operators-57b9s\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:56 crc kubenswrapper[4626]: I0223 07:40:56.381729 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:40:57 crc kubenswrapper[4626]: I0223 07:40:57.141422 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57b9s"] Feb 23 07:40:57 crc kubenswrapper[4626]: I0223 07:40:57.510691 4626 generic.go:334] "Generic (PLEG): container finished" podID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerID="8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3" exitCode=0 Feb 23 07:40:57 crc kubenswrapper[4626]: I0223 07:40:57.510753 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57b9s" event={"ID":"96f88aa6-60c5-440e-bb86-26a0b274ed07","Type":"ContainerDied","Data":"8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3"} Feb 23 07:40:57 crc kubenswrapper[4626]: I0223 07:40:57.511029 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57b9s" event={"ID":"96f88aa6-60c5-440e-bb86-26a0b274ed07","Type":"ContainerStarted","Data":"27456e29a513a9b428e4dc7c8b9a2f27025ca36cb4f4aecf04d327c8eee9d4cb"} Feb 23 07:40:57 crc kubenswrapper[4626]: I0223 07:40:57.514488 4626 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:40:58 crc kubenswrapper[4626]: I0223 07:40:58.520839 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57b9s" event={"ID":"96f88aa6-60c5-440e-bb86-26a0b274ed07","Type":"ContainerStarted","Data":"2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05"} Feb 23 07:40:59 crc kubenswrapper[4626]: I0223 07:40:59.530936 4626 generic.go:334] "Generic (PLEG): container finished" podID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerID="2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05" exitCode=0 Feb 23 07:40:59 crc kubenswrapper[4626]: I0223 07:40:59.530976 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57b9s" event={"ID":"96f88aa6-60c5-440e-bb86-26a0b274ed07","Type":"ContainerDied","Data":"2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05"} Feb 23 07:41:00 crc kubenswrapper[4626]: I0223 07:41:00.544668 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57b9s" event={"ID":"96f88aa6-60c5-440e-bb86-26a0b274ed07","Type":"ContainerStarted","Data":"d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289"} Feb 23 07:41:00 crc kubenswrapper[4626]: I0223 07:41:00.569698 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-57b9s" podStartSLOduration=3.040638786 podStartE2EDuration="5.569123119s" podCreationTimestamp="2026-02-23 07:40:55 +0000 UTC" firstStartedPulling="2026-02-23 07:40:57.512569613 +0000 UTC m=+3609.851898879" lastFinishedPulling="2026-02-23 07:41:00.041053946 +0000 UTC m=+3612.380383212" observedRunningTime="2026-02-23 07:41:00.561021994 +0000 UTC m=+3612.900351260" watchObservedRunningTime="2026-02-23 07:41:00.569123119 +0000 UTC m=+3612.908452384" Feb 23 07:41:06 crc kubenswrapper[4626]: I0223 07:41:06.382698 4626 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:41:06 crc kubenswrapper[4626]: I0223 07:41:06.383078 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:41:06 crc kubenswrapper[4626]: I0223 07:41:06.424033 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:41:06 crc kubenswrapper[4626]: I0223 07:41:06.651628 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:41:07 crc kubenswrapper[4626]: I0223 07:41:07.114272 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57b9s"] Feb 23 07:41:08 crc kubenswrapper[4626]: I0223 07:41:08.633647 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-57b9s" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="registry-server" containerID="cri-o://d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289" gracePeriod=2 Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.313260 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.489069 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-utilities\") pod \"96f88aa6-60c5-440e-bb86-26a0b274ed07\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.489209 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx4jh\" (UniqueName: \"kubernetes.io/projected/96f88aa6-60c5-440e-bb86-26a0b274ed07-kube-api-access-xx4jh\") pod \"96f88aa6-60c5-440e-bb86-26a0b274ed07\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.489249 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-catalog-content\") pod \"96f88aa6-60c5-440e-bb86-26a0b274ed07\" (UID: \"96f88aa6-60c5-440e-bb86-26a0b274ed07\") " Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.491724 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-utilities" (OuterVolumeSpecName: "utilities") pod "96f88aa6-60c5-440e-bb86-26a0b274ed07" (UID: "96f88aa6-60c5-440e-bb86-26a0b274ed07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.512846 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f88aa6-60c5-440e-bb86-26a0b274ed07-kube-api-access-xx4jh" (OuterVolumeSpecName: "kube-api-access-xx4jh") pod "96f88aa6-60c5-440e-bb86-26a0b274ed07" (UID: "96f88aa6-60c5-440e-bb86-26a0b274ed07"). InnerVolumeSpecName "kube-api-access-xx4jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.579825 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96f88aa6-60c5-440e-bb86-26a0b274ed07" (UID: "96f88aa6-60c5-440e-bb86-26a0b274ed07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.592362 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx4jh\" (UniqueName: \"kubernetes.io/projected/96f88aa6-60c5-440e-bb86-26a0b274ed07-kube-api-access-xx4jh\") on node \"crc\" DevicePath \"\"" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.592413 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.592426 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f88aa6-60c5-440e-bb86-26a0b274ed07-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.649242 4626 generic.go:334] "Generic (PLEG): container finished" podID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerID="d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289" exitCode=0 Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.649296 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57b9s" event={"ID":"96f88aa6-60c5-440e-bb86-26a0b274ed07","Type":"ContainerDied","Data":"d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289"} Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.649339 4626 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-57b9s" event={"ID":"96f88aa6-60c5-440e-bb86-26a0b274ed07","Type":"ContainerDied","Data":"27456e29a513a9b428e4dc7c8b9a2f27025ca36cb4f4aecf04d327c8eee9d4cb"} Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.649370 4626 scope.go:117] "RemoveContainer" containerID="d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.649386 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57b9s" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.720040 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57b9s"] Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.726782 4626 scope.go:117] "RemoveContainer" containerID="2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.730724 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-57b9s"] Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.759322 4626 scope.go:117] "RemoveContainer" containerID="8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.817651 4626 scope.go:117] "RemoveContainer" containerID="d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289" Feb 23 07:41:09 crc kubenswrapper[4626]: E0223 07:41:09.858625 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289\": container with ID starting with d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289 not found: ID does not exist" containerID="d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 
07:41:09.858676 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289"} err="failed to get container status \"d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289\": rpc error: code = NotFound desc = could not find container \"d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289\": container with ID starting with d08fee3e58ea80837a22790e40688a66f26a57ab5d06309990f0ec5d48723289 not found: ID does not exist" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.858714 4626 scope.go:117] "RemoveContainer" containerID="2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05" Feb 23 07:41:09 crc kubenswrapper[4626]: E0223 07:41:09.863078 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05\": container with ID starting with 2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05 not found: ID does not exist" containerID="2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.863113 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05"} err="failed to get container status \"2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05\": rpc error: code = NotFound desc = could not find container \"2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05\": container with ID starting with 2eb79fcf84900ba7c898e0bea2955901a3c874b460798dcac8ea21f78a67ca05 not found: ID does not exist" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.863140 4626 scope.go:117] "RemoveContainer" containerID="8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3" Feb 23 07:41:09 crc 
kubenswrapper[4626]: E0223 07:41:09.863592 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3\": container with ID starting with 8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3 not found: ID does not exist" containerID="8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.863613 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3"} err="failed to get container status \"8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3\": rpc error: code = NotFound desc = could not find container \"8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3\": container with ID starting with 8a130dfad38b4c2b0266281248e110edd002911479dc201547b1692c21dcbeb3 not found: ID does not exist" Feb 23 07:41:09 crc kubenswrapper[4626]: I0223 07:41:09.996981 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" path="/var/lib/kubelet/pods/96f88aa6-60c5-440e-bb86-26a0b274ed07/volumes" Feb 23 07:41:55 crc kubenswrapper[4626]: I0223 07:41:55.686525 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:41:55 crc kubenswrapper[4626]: I0223 07:41:55.688884 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 23 07:42:25 crc kubenswrapper[4626]: I0223 07:42:25.684919 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:42:25 crc kubenswrapper[4626]: I0223 07:42:25.685408 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:42:55 crc kubenswrapper[4626]: I0223 07:42:55.685270 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:42:55 crc kubenswrapper[4626]: I0223 07:42:55.685738 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:42:55 crc kubenswrapper[4626]: I0223 07:42:55.686417 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:42:55 crc kubenswrapper[4626]: I0223 07:42:55.688136 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:42:55 crc kubenswrapper[4626]: I0223 07:42:55.688757 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" gracePeriod=600 Feb 23 07:42:55 crc kubenswrapper[4626]: E0223 07:42:55.817933 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:42:56 crc kubenswrapper[4626]: I0223 07:42:56.463993 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"} Feb 23 07:42:56 crc kubenswrapper[4626]: I0223 07:42:56.464069 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" exitCode=0 Feb 23 07:42:56 crc kubenswrapper[4626]: I0223 07:42:56.466810 4626 scope.go:117] "RemoveContainer" containerID="5fa0f336bfdf7b911090bf6cb811863ef59de648375d06011c6d727a9ef66f0c" Feb 23 07:42:56 crc kubenswrapper[4626]: I0223 07:42:56.467558 4626 
scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:42:56 crc kubenswrapper[4626]: E0223 07:42:56.467976 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:43:08 crc kubenswrapper[4626]: I0223 07:43:08.981859 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:43:08 crc kubenswrapper[4626]: E0223 07:43:08.982749 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:43:20 crc kubenswrapper[4626]: I0223 07:43:20.982543 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:43:20 crc kubenswrapper[4626]: E0223 07:43:20.983058 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:43:31 crc kubenswrapper[4626]: I0223 
07:43:31.982438 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:43:31 crc kubenswrapper[4626]: E0223 07:43:31.983326 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:43:43 crc kubenswrapper[4626]: I0223 07:43:43.982184 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:43:43 crc kubenswrapper[4626]: E0223 07:43:43.983217 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:43:55 crc kubenswrapper[4626]: I0223 07:43:55.982638 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:43:55 crc kubenswrapper[4626]: E0223 07:43:55.983301 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:44:07 crc 
kubenswrapper[4626]: I0223 07:44:07.986996 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:44:07 crc kubenswrapper[4626]: E0223 07:44:07.987567 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:44:21 crc kubenswrapper[4626]: I0223 07:44:21.982263 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:44:21 crc kubenswrapper[4626]: E0223 07:44:21.983169 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:44:36 crc kubenswrapper[4626]: I0223 07:44:36.982755 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:44:36 crc kubenswrapper[4626]: E0223 07:44:36.983417 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 
23 07:44:49 crc kubenswrapper[4626]: I0223 07:44:49.982306 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:44:49 crc kubenswrapper[4626]: E0223 07:44:49.983144 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.335412 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hrhcw"] Feb 23 07:44:51 crc kubenswrapper[4626]: E0223 07:44:51.339985 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="extract-utilities" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.340033 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="extract-utilities" Feb 23 07:44:51 crc kubenswrapper[4626]: E0223 07:44:51.340042 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="extract-content" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.340049 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="extract-content" Feb 23 07:44:51 crc kubenswrapper[4626]: E0223 07:44:51.340061 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="registry-server" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.340066 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="registry-server" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.340692 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f88aa6-60c5-440e-bb86-26a0b274ed07" containerName="registry-server" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.362815 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.386911 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrhcw"] Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.474880 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-utilities\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.474974 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxk4\" (UniqueName: \"kubernetes.io/projected/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-kube-api-access-scxk4\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.475067 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-catalog-content\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.576743 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-utilities\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.576823 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scxk4\" (UniqueName: \"kubernetes.io/projected/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-kube-api-access-scxk4\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.576910 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-catalog-content\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.580010 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-utilities\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.580233 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-catalog-content\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.602617 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-scxk4\" (UniqueName: \"kubernetes.io/projected/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-kube-api-access-scxk4\") pod \"certified-operators-hrhcw\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") " pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:51 crc kubenswrapper[4626]: I0223 07:44:51.704597 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrhcw" Feb 23 07:44:52 crc kubenswrapper[4626]: I0223 07:44:52.425373 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrhcw"] Feb 23 07:44:53 crc kubenswrapper[4626]: I0223 07:44:53.276701 4626 generic.go:334] "Generic (PLEG): container finished" podID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerID="1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1" exitCode=0 Feb 23 07:44:53 crc kubenswrapper[4626]: I0223 07:44:53.276870 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrhcw" event={"ID":"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8","Type":"ContainerDied","Data":"1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1"} Feb 23 07:44:53 crc kubenswrapper[4626]: I0223 07:44:53.277021 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrhcw" event={"ID":"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8","Type":"ContainerStarted","Data":"4ff4e2008ca1c0eb5d7b6732ee999de52a84c3d11a830d7dc3deb602fb9bdb6a"} Feb 23 07:44:54 crc kubenswrapper[4626]: I0223 07:44:54.284104 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrhcw" event={"ID":"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8","Type":"ContainerStarted","Data":"33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa"} Feb 23 07:44:55 crc kubenswrapper[4626]: I0223 07:44:55.291737 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerID="33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa" exitCode=0 Feb 23 07:44:55 crc kubenswrapper[4626]: I0223 07:44:55.291846 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrhcw" event={"ID":"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8","Type":"ContainerDied","Data":"33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa"} Feb 23 07:44:56 crc kubenswrapper[4626]: I0223 07:44:56.301700 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrhcw" event={"ID":"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8","Type":"ContainerStarted","Data":"f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7"} Feb 23 07:44:56 crc kubenswrapper[4626]: I0223 07:44:56.319432 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hrhcw" podStartSLOduration=2.824818054 podStartE2EDuration="5.318068756s" podCreationTimestamp="2026-02-23 07:44:51 +0000 UTC" firstStartedPulling="2026-02-23 07:44:53.278519491 +0000 UTC m=+3845.617848757" lastFinishedPulling="2026-02-23 07:44:55.771770194 +0000 UTC m=+3848.111099459" observedRunningTime="2026-02-23 07:44:56.315878587 +0000 UTC m=+3848.655207852" watchObservedRunningTime="2026-02-23 07:44:56.318068756 +0000 UTC m=+3848.657398022" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.209621 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m"] Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.214401 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m"] Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.214546 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.226093 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.238726 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd82j\" (UniqueName: \"kubernetes.io/projected/b5a56899-cb3c-4da0-bcb1-af450262d173-kube-api-access-pd82j\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.238914 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a56899-cb3c-4da0-bcb1-af450262d173-config-volume\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.239476 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a56899-cb3c-4da0-bcb1-af450262d173-secret-volume\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.264423 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.341639 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/b5a56899-cb3c-4da0-bcb1-af450262d173-secret-volume\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.341834 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd82j\" (UniqueName: \"kubernetes.io/projected/b5a56899-cb3c-4da0-bcb1-af450262d173-kube-api-access-pd82j\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.341980 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a56899-cb3c-4da0-bcb1-af450262d173-config-volume\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.343028 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a56899-cb3c-4da0-bcb1-af450262d173-config-volume\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.352442 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a56899-cb3c-4da0-bcb1-af450262d173-secret-volume\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.356980 4626 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd82j\" (UniqueName: \"kubernetes.io/projected/b5a56899-cb3c-4da0-bcb1-af450262d173-kube-api-access-pd82j\") pod \"collect-profiles-29530545-mp84m\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.540698 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" Feb 23 07:45:00 crc kubenswrapper[4626]: I0223 07:45:00.983144 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:45:00 crc kubenswrapper[4626]: E0223 07:45:00.983684 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:45:01 crc kubenswrapper[4626]: I0223 07:45:01.023035 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m"] Feb 23 07:45:01 crc kubenswrapper[4626]: W0223 07:45:01.033769 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a56899_cb3c_4da0_bcb1_af450262d173.slice/crio-8f8df50428fad60f48d2f24d395a2078204f4aeec62274da83d65a53ab345158 WatchSource:0}: Error finding container 8f8df50428fad60f48d2f24d395a2078204f4aeec62274da83d65a53ab345158: Status 404 returned error can't find the container with id 8f8df50428fad60f48d2f24d395a2078204f4aeec62274da83d65a53ab345158 Feb 23 07:45:01 crc 
kubenswrapper[4626]: I0223 07:45:01.344880 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" event={"ID":"b5a56899-cb3c-4da0-bcb1-af450262d173","Type":"ContainerStarted","Data":"020947f1a4c59d9e51ead17b7194047dd16164f91dc7931793256c93002e38ba"}
Feb 23 07:45:01 crc kubenswrapper[4626]: I0223 07:45:01.345312 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" event={"ID":"b5a56899-cb3c-4da0-bcb1-af450262d173","Type":"ContainerStarted","Data":"8f8df50428fad60f48d2f24d395a2078204f4aeec62274da83d65a53ab345158"}
Feb 23 07:45:01 crc kubenswrapper[4626]: I0223 07:45:01.363118 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" podStartSLOduration=1.363100325 podStartE2EDuration="1.363100325s" podCreationTimestamp="2026-02-23 07:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:45:01.36110865 +0000 UTC m=+3853.700437916" watchObservedRunningTime="2026-02-23 07:45:01.363100325 +0000 UTC m=+3853.702429591"
Feb 23 07:45:01 crc kubenswrapper[4626]: I0223 07:45:01.705410 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hrhcw"
Feb 23 07:45:01 crc kubenswrapper[4626]: I0223 07:45:01.705458 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hrhcw"
Feb 23 07:45:01 crc kubenswrapper[4626]: I0223 07:45:01.744177 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hrhcw"
Feb 23 07:45:02 crc kubenswrapper[4626]: I0223 07:45:02.354152 4626 generic.go:334] "Generic (PLEG): container finished" podID="b5a56899-cb3c-4da0-bcb1-af450262d173" containerID="020947f1a4c59d9e51ead17b7194047dd16164f91dc7931793256c93002e38ba" exitCode=0
Feb 23 07:45:02 crc kubenswrapper[4626]: I0223 07:45:02.354203 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" event={"ID":"b5a56899-cb3c-4da0-bcb1-af450262d173","Type":"ContainerDied","Data":"020947f1a4c59d9e51ead17b7194047dd16164f91dc7931793256c93002e38ba"}
Feb 23 07:45:02 crc kubenswrapper[4626]: I0223 07:45:02.396916 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hrhcw"
Feb 23 07:45:02 crc kubenswrapper[4626]: I0223 07:45:02.443816 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrhcw"]
Feb 23 07:45:03 crc kubenswrapper[4626]: I0223 07:45:03.817945 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m"
Feb 23 07:45:03 crc kubenswrapper[4626]: I0223 07:45:03.915175 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd82j\" (UniqueName: \"kubernetes.io/projected/b5a56899-cb3c-4da0-bcb1-af450262d173-kube-api-access-pd82j\") pod \"b5a56899-cb3c-4da0-bcb1-af450262d173\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") "
Feb 23 07:45:03 crc kubenswrapper[4626]: I0223 07:45:03.915232 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a56899-cb3c-4da0-bcb1-af450262d173-config-volume\") pod \"b5a56899-cb3c-4da0-bcb1-af450262d173\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") "
Feb 23 07:45:03 crc kubenswrapper[4626]: I0223 07:45:03.915279 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a56899-cb3c-4da0-bcb1-af450262d173-secret-volume\") pod \"b5a56899-cb3c-4da0-bcb1-af450262d173\" (UID: \"b5a56899-cb3c-4da0-bcb1-af450262d173\") "
Feb 23 07:45:03 crc kubenswrapper[4626]: I0223 07:45:03.917113 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5a56899-cb3c-4da0-bcb1-af450262d173-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5a56899-cb3c-4da0-bcb1-af450262d173" (UID: "b5a56899-cb3c-4da0-bcb1-af450262d173"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:45:03 crc kubenswrapper[4626]: I0223 07:45:03.924766 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a56899-cb3c-4da0-bcb1-af450262d173-kube-api-access-pd82j" (OuterVolumeSpecName: "kube-api-access-pd82j") pod "b5a56899-cb3c-4da0-bcb1-af450262d173" (UID: "b5a56899-cb3c-4da0-bcb1-af450262d173"). InnerVolumeSpecName "kube-api-access-pd82j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:45:03 crc kubenswrapper[4626]: I0223 07:45:03.928986 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a56899-cb3c-4da0-bcb1-af450262d173-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b5a56899-cb3c-4da0-bcb1-af450262d173" (UID: "b5a56899-cb3c-4da0-bcb1-af450262d173"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.018249 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd82j\" (UniqueName: \"kubernetes.io/projected/b5a56899-cb3c-4da0-bcb1-af450262d173-kube-api-access-pd82j\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.018283 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5a56899-cb3c-4da0-bcb1-af450262d173-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.018302 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5a56899-cb3c-4da0-bcb1-af450262d173-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.374683 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m" event={"ID":"b5a56899-cb3c-4da0-bcb1-af450262d173","Type":"ContainerDied","Data":"8f8df50428fad60f48d2f24d395a2078204f4aeec62274da83d65a53ab345158"}
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.375036 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f8df50428fad60f48d2f24d395a2078204f4aeec62274da83d65a53ab345158"
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.375234 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m"
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.375229 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hrhcw" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="registry-server" containerID="cri-o://f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7" gracePeriod=2
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.459151 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"]
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.466579 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-w6dnf"]
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.821822 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrhcw"
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.938834 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-catalog-content\") pod \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") "
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.939372 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-utilities\") pod \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") "
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.939603 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scxk4\" (UniqueName: \"kubernetes.io/projected/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-kube-api-access-scxk4\") pod \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\" (UID: \"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8\") "
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.940308 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-utilities" (OuterVolumeSpecName: "utilities") pod "97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" (UID: "97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.941757 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:04 crc kubenswrapper[4626]: I0223 07:45:04.945059 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-kube-api-access-scxk4" (OuterVolumeSpecName: "kube-api-access-scxk4") pod "97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" (UID: "97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8"). InnerVolumeSpecName "kube-api-access-scxk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.012122 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" (UID: "97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.044394 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.044421 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scxk4\" (UniqueName: \"kubernetes.io/projected/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8-kube-api-access-scxk4\") on node \"crc\" DevicePath \"\""
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.385155 4626 generic.go:334] "Generic (PLEG): container finished" podID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerID="f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7" exitCode=0
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.385195 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrhcw" event={"ID":"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8","Type":"ContainerDied","Data":"f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7"}
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.385244 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrhcw" event={"ID":"97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8","Type":"ContainerDied","Data":"4ff4e2008ca1c0eb5d7b6732ee999de52a84c3d11a830d7dc3deb602fb9bdb6a"}
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.385265 4626 scope.go:117] "RemoveContainer" containerID="f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.385271 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrhcw"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.409460 4626 scope.go:117] "RemoveContainer" containerID="33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.417906 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrhcw"]
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.425240 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hrhcw"]
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.435166 4626 scope.go:117] "RemoveContainer" containerID="1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.462906 4626 scope.go:117] "RemoveContainer" containerID="f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7"
Feb 23 07:45:05 crc kubenswrapper[4626]: E0223 07:45:05.464344 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7\": container with ID starting with f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7 not found: ID does not exist" containerID="f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.464410 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7"} err="failed to get container status \"f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7\": rpc error: code = NotFound desc = could not find container \"f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7\": container with ID starting with f77db4a015563cc0c638a12af46986fd4f9283754c189c0a658542e16cb64ad7 not found: ID does not exist"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.464463 4626 scope.go:117] "RemoveContainer" containerID="33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa"
Feb 23 07:45:05 crc kubenswrapper[4626]: E0223 07:45:05.465027 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa\": container with ID starting with 33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa not found: ID does not exist" containerID="33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.465062 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa"} err="failed to get container status \"33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa\": rpc error: code = NotFound desc = could not find container \"33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa\": container with ID starting with 33be17df38034b636d6c83a34fda161ae9776adc4aa88154f0bda8e3594d49aa not found: ID does not exist"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.465084 4626 scope.go:117] "RemoveContainer" containerID="1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1"
Feb 23 07:45:05 crc kubenswrapper[4626]: E0223 07:45:05.465429 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1\": container with ID starting with 1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1 not found: ID does not exist" containerID="1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.465464 4626
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1"} err="failed to get container status \"1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1\": rpc error: code = NotFound desc = could not find container \"1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1\": container with ID starting with 1f602198aecbb406846f47d7a1f2dd2b57f54d89a6dfe1f3e622496d95a4b5d1 not found: ID does not exist"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.994584 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0" path="/var/lib/kubelet/pods/8a2e0aa8-65ca-4096-8dd3-fb2e73f7c3c0/volumes"
Feb 23 07:45:05 crc kubenswrapper[4626]: I0223 07:45:05.996997 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" path="/var/lib/kubelet/pods/97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8/volumes"
Feb 23 07:45:12 crc kubenswrapper[4626]: I0223 07:45:12.982664 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:45:12 crc kubenswrapper[4626]: E0223 07:45:12.983155 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:45:22 crc kubenswrapper[4626]: I0223 07:45:22.541842 4626 scope.go:117] "RemoveContainer" containerID="de8772856d5052fc2b5063c72cb46e6bbfc902f8ffc0916450590bcafe4071cd"
Feb 23 07:45:25 crc kubenswrapper[4626]: I0223 07:45:25.982242 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:45:25 crc kubenswrapper[4626]: E0223 07:45:25.983044 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:45:36 crc kubenswrapper[4626]: I0223 07:45:36.981966 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:45:36 crc kubenswrapper[4626]: E0223 07:45:36.982673 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:45:50 crc kubenswrapper[4626]: I0223 07:45:50.982487 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:45:50 crc kubenswrapper[4626]: E0223 07:45:50.984675 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:46:04 crc kubenswrapper[4626]: I0223 07:46:04.982550 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:46:04 crc kubenswrapper[4626]: E0223 07:46:04.983450 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:46:18 crc kubenswrapper[4626]: I0223 07:46:18.982088 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:46:18 crc kubenswrapper[4626]: E0223 07:46:18.984676 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:46:33 crc kubenswrapper[4626]: I0223 07:46:33.982227 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:46:33 crc kubenswrapper[4626]: E0223 07:46:33.983010 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:46:46 crc kubenswrapper[4626]: I0223 07:46:46.981977
4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:46:46 crc kubenswrapper[4626]: E0223 07:46:46.982563 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:47:00 crc kubenswrapper[4626]: I0223 07:47:00.983218 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:47:00 crc kubenswrapper[4626]: E0223 07:47:00.983754 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:47:11 crc kubenswrapper[4626]: I0223 07:47:11.982462 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:47:11 crc kubenswrapper[4626]: E0223 07:47:11.983058 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:47:26 crc kubenswrapper[4626]: I0223 07:47:26.982294 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:47:26 crc kubenswrapper[4626]: E0223 07:47:26.982998 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:47:41 crc kubenswrapper[4626]: I0223 07:47:41.982284 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:47:41 crc kubenswrapper[4626]: E0223 07:47:41.983048 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:47:52 crc kubenswrapper[4626]: I0223 07:47:52.982034 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:47:52 crc kubenswrapper[4626]: E0223 07:47:52.982753 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 07:48:04 crc kubenswrapper[4626]: I0223 07:48:04.982490 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f"
Feb 23 07:48:05 crc kubenswrapper[4626]: I0223 07:48:05.752430 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"770490da4aa00de65c536d96532e92f0bfe04f6b4727040a71d0281e00477008"}
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.150564 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jlbsz"]
Feb 23 07:49:51 crc kubenswrapper[4626]: E0223 07:49:51.152523 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="extract-utilities"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.152762 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="extract-utilities"
Feb 23 07:49:51 crc kubenswrapper[4626]: E0223 07:49:51.152811 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a56899-cb3c-4da0-bcb1-af450262d173" containerName="collect-profiles"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.152817 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a56899-cb3c-4da0-bcb1-af450262d173" containerName="collect-profiles"
Feb 23 07:49:51 crc kubenswrapper[4626]: E0223 07:49:51.152828 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="registry-server"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.152834 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="registry-server"
Feb 23 07:49:51 crc kubenswrapper[4626]: E0223 07:49:51.152845 4626 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="extract-content"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.152851 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="extract-content"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.153105 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b1edfc-0c0d-4f77-b7d2-8ea27c8e70f8" containerName="registry-server"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.153122 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a56899-cb3c-4da0-bcb1-af450262d173" containerName="collect-profiles"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.154676 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.251881 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-catalog-content\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.251959 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4qb\" (UniqueName: \"kubernetes.io/projected/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-kube-api-access-7b4qb\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.252053 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-utilities\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.272354 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlbsz"]
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.353684 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-catalog-content\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.354104 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-catalog-content\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.354217 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4qb\" (UniqueName: \"kubernetes.io/projected/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-kube-api-access-7b4qb\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.354330 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-utilities\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.354709 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-utilities\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.382767 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4qb\" (UniqueName: \"kubernetes.io/projected/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-kube-api-access-7b4qb\") pod \"redhat-operators-jlbsz\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") " pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:51 crc kubenswrapper[4626]: I0223 07:49:51.468979 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:49:52 crc kubenswrapper[4626]: I0223 07:49:52.190966 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlbsz"]
Feb 23 07:49:52 crc kubenswrapper[4626]: W0223 07:49:52.202125 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode617bb4c_e6e3_49fc_b67c_d08cd738a3b1.slice/crio-5f332663e20458c778683280eb6c2c355b1f47ded755a7bdea11e0989b543e3e WatchSource:0}: Error finding container 5f332663e20458c778683280eb6c2c355b1f47ded755a7bdea11e0989b543e3e: Status 404 returned error can't find the container with id 5f332663e20458c778683280eb6c2c355b1f47ded755a7bdea11e0989b543e3e
Feb 23 07:49:52 crc kubenswrapper[4626]: I0223 07:49:52.490946 4626 generic.go:334] "Generic (PLEG): container finished" podID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerID="986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4" exitCode=0
Feb 23 07:49:52 crc kubenswrapper[4626]: I0223 07:49:52.491152 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbsz"
event={"ID":"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1","Type":"ContainerDied","Data":"986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4"}
Feb 23 07:49:52 crc kubenswrapper[4626]: I0223 07:49:52.491243 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbsz" event={"ID":"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1","Type":"ContainerStarted","Data":"5f332663e20458c778683280eb6c2c355b1f47ded755a7bdea11e0989b543e3e"}
Feb 23 07:49:52 crc kubenswrapper[4626]: I0223 07:49:52.497060 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 07:49:53 crc kubenswrapper[4626]: I0223 07:49:53.514464 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbsz" event={"ID":"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1","Type":"ContainerStarted","Data":"4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55"}
Feb 23 07:49:56 crc kubenswrapper[4626]: I0223 07:49:56.555650 4626 generic.go:334] "Generic (PLEG): container finished" podID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerID="4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55" exitCode=0
Feb 23 07:49:56 crc kubenswrapper[4626]: I0223 07:49:56.555759 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbsz" event={"ID":"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1","Type":"ContainerDied","Data":"4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55"}
Feb 23 07:49:57 crc kubenswrapper[4626]: I0223 07:49:57.568183 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbsz" event={"ID":"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1","Type":"ContainerStarted","Data":"2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8"}
Feb 23 07:49:57 crc kubenswrapper[4626]: I0223 07:49:57.593706 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jlbsz" podStartSLOduration=2.051772873 podStartE2EDuration="6.593679315s" podCreationTimestamp="2026-02-23 07:49:51 +0000 UTC" firstStartedPulling="2026-02-23 07:49:52.492950817 +0000 UTC m=+4144.832280083" lastFinishedPulling="2026-02-23 07:49:57.034857238 +0000 UTC m=+4149.374186525" observedRunningTime="2026-02-23 07:49:57.584542859 +0000 UTC m=+4149.923872125" watchObservedRunningTime="2026-02-23 07:49:57.593679315 +0000 UTC m=+4149.933008582"
Feb 23 07:50:01 crc kubenswrapper[4626]: I0223 07:50:01.470236 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:50:01 crc kubenswrapper[4626]: I0223 07:50:01.470829 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:50:02 crc kubenswrapper[4626]: I0223 07:50:02.525767 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jlbsz" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="registry-server" probeResult="failure" output=<
Feb 23 07:50:02 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 07:50:02 crc kubenswrapper[4626]: >
Feb 23 07:50:12 crc kubenswrapper[4626]: I0223 07:50:12.511537 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jlbsz" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="registry-server" probeResult="failure" output=<
Feb 23 07:50:12 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 07:50:12 crc kubenswrapper[4626]: >
Feb 23 07:50:21 crc kubenswrapper[4626]: I0223 07:50:21.506146 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:50:21 crc kubenswrapper[4626]: I0223 07:50:21.547651 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:50:22 crc kubenswrapper[4626]: I0223 07:50:22.304230 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlbsz"]
Feb 23 07:50:22 crc kubenswrapper[4626]: I0223 07:50:22.776251 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jlbsz" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="registry-server" containerID="cri-o://2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8" gracePeriod=2
Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.446461 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbsz"
Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.618933 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-utilities\") pod \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") "
Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.619199 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b4qb\" (UniqueName: \"kubernetes.io/projected/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-kube-api-access-7b4qb\") pod \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") "
Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.619352 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-catalog-content\") pod \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\" (UID: \"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1\") "
Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.621115 4626 operation_generator.go:803]
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-utilities" (OuterVolumeSpecName: "utilities") pod "e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" (UID: "e617bb4c-e6e3-49fc-b67c-d08cd738a3b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.640480 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-kube-api-access-7b4qb" (OuterVolumeSpecName: "kube-api-access-7b4qb") pod "e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" (UID: "e617bb4c-e6e3-49fc-b67c-d08cd738a3b1"). InnerVolumeSpecName "kube-api-access-7b4qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.724098 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.724257 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b4qb\" (UniqueName: \"kubernetes.io/projected/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-kube-api-access-7b4qb\") on node \"crc\" DevicePath \"\"" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.735362 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" (UID: "e617bb4c-e6e3-49fc-b67c-d08cd738a3b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.797309 4626 generic.go:334] "Generic (PLEG): container finished" podID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerID="2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8" exitCode=0 Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.797399 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbsz" event={"ID":"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1","Type":"ContainerDied","Data":"2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8"} Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.797449 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbsz" event={"ID":"e617bb4c-e6e3-49fc-b67c-d08cd738a3b1","Type":"ContainerDied","Data":"5f332663e20458c778683280eb6c2c355b1f47ded755a7bdea11e0989b543e3e"} Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.797779 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbsz" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.798124 4626 scope.go:117] "RemoveContainer" containerID="2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.829313 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.832367 4626 scope.go:117] "RemoveContainer" containerID="4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.847603 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlbsz"] Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.852136 4626 scope.go:117] "RemoveContainer" containerID="986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.862998 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jlbsz"] Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.888644 4626 scope.go:117] "RemoveContainer" containerID="2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8" Feb 23 07:50:23 crc kubenswrapper[4626]: E0223 07:50:23.891175 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8\": container with ID starting with 2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8 not found: ID does not exist" containerID="2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.892286 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8"} err="failed to get container status \"2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8\": rpc error: code = NotFound desc = could not find container \"2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8\": container with ID starting with 2bf624d0b5b6ce2836202f1fa2d3377dfc41d22b2ff8cee0d32d31504e55a3d8 not found: ID does not exist" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.892402 4626 scope.go:117] "RemoveContainer" containerID="4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55" Feb 23 07:50:23 crc kubenswrapper[4626]: E0223 07:50:23.893215 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55\": container with ID starting with 4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55 not found: ID does not exist" containerID="4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.893300 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55"} err="failed to get container status \"4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55\": rpc error: code = NotFound desc = could not find container \"4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55\": container with ID starting with 4146b592bf3d71387c11011005c23d25a4fc3d1e048cc5af9d45a3fe2a087f55 not found: ID does not exist" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.893366 4626 scope.go:117] "RemoveContainer" containerID="986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4" Feb 23 07:50:23 crc kubenswrapper[4626]: E0223 07:50:23.893953 4626 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4\": container with ID starting with 986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4 not found: ID does not exist" containerID="986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.894012 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4"} err="failed to get container status \"986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4\": rpc error: code = NotFound desc = could not find container \"986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4\": container with ID starting with 986b51be1c75ad7f160752fbc9685b24b11cd8bdef15073bd0c78f4a83cbf1a4 not found: ID does not exist" Feb 23 07:50:23 crc kubenswrapper[4626]: I0223 07:50:23.991415 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" path="/var/lib/kubelet/pods/e617bb4c-e6e3-49fc-b67c-d08cd738a3b1/volumes" Feb 23 07:50:25 crc kubenswrapper[4626]: I0223 07:50:25.685533 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:50:25 crc kubenswrapper[4626]: I0223 07:50:25.686304 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:50:55 crc kubenswrapper[4626]: I0223 
07:50:55.685965 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:50:55 crc kubenswrapper[4626]: I0223 07:50:55.687435 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:51:25 crc kubenswrapper[4626]: I0223 07:51:25.685649 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:51:25 crc kubenswrapper[4626]: I0223 07:51:25.686452 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:51:25 crc kubenswrapper[4626]: I0223 07:51:25.686552 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 07:51:25 crc kubenswrapper[4626]: I0223 07:51:25.687971 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"770490da4aa00de65c536d96532e92f0bfe04f6b4727040a71d0281e00477008"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:51:25 crc kubenswrapper[4626]: I0223 07:51:25.688056 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://770490da4aa00de65c536d96532e92f0bfe04f6b4727040a71d0281e00477008" gracePeriod=600 Feb 23 07:51:26 crc kubenswrapper[4626]: I0223 07:51:26.301849 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="770490da4aa00de65c536d96532e92f0bfe04f6b4727040a71d0281e00477008" exitCode=0 Feb 23 07:51:26 crc kubenswrapper[4626]: I0223 07:51:26.302197 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"770490da4aa00de65c536d96532e92f0bfe04f6b4727040a71d0281e00477008"} Feb 23 07:51:26 crc kubenswrapper[4626]: I0223 07:51:26.302229 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c"} Feb 23 07:51:26 crc kubenswrapper[4626]: I0223 07:51:26.302247 4626 scope.go:117] "RemoveContainer" containerID="872daa29fd6c118dfe3bda414a01e77e2cc5c2431b1f5e7acb2a47b12c43598f" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.493770 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pqdwj"] Feb 23 07:51:49 crc kubenswrapper[4626]: E0223 07:51:49.495648 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="extract-utilities" Feb 23 07:51:49 crc 
kubenswrapper[4626]: I0223 07:51:49.495677 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="extract-utilities" Feb 23 07:51:49 crc kubenswrapper[4626]: E0223 07:51:49.495742 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="registry-server" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.495750 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="registry-server" Feb 23 07:51:49 crc kubenswrapper[4626]: E0223 07:51:49.495778 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="extract-content" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.495786 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="extract-content" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.496385 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="e617bb4c-e6e3-49fc-b67c-d08cd738a3b1" containerName="registry-server" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.499606 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.516477 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqdwj"] Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.560548 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-utilities\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.560821 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbqn9\" (UniqueName: \"kubernetes.io/projected/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-kube-api-access-cbqn9\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.560906 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-catalog-content\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.662804 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-utilities\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.662929 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cbqn9\" (UniqueName: \"kubernetes.io/projected/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-kube-api-access-cbqn9\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.662967 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-catalog-content\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.665523 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-utilities\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.665860 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-catalog-content\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.682013 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbqn9\" (UniqueName: \"kubernetes.io/projected/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-kube-api-access-cbqn9\") pod \"community-operators-pqdwj\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:49 crc kubenswrapper[4626]: I0223 07:51:49.822171 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:50 crc kubenswrapper[4626]: I0223 07:51:50.258671 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pqdwj"] Feb 23 07:51:50 crc kubenswrapper[4626]: I0223 07:51:50.479565 4626 generic.go:334] "Generic (PLEG): container finished" podID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerID="0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610" exitCode=0 Feb 23 07:51:50 crc kubenswrapper[4626]: I0223 07:51:50.479712 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqdwj" event={"ID":"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7","Type":"ContainerDied","Data":"0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610"} Feb 23 07:51:50 crc kubenswrapper[4626]: I0223 07:51:50.479900 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqdwj" event={"ID":"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7","Type":"ContainerStarted","Data":"9536cb1b0a6a324f546da66278b3a096765cf0c1bd0b4610b032d2457d3233f7"} Feb 23 07:51:51 crc kubenswrapper[4626]: I0223 07:51:51.488929 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqdwj" event={"ID":"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7","Type":"ContainerStarted","Data":"31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36"} Feb 23 07:51:52 crc kubenswrapper[4626]: I0223 07:51:52.496693 4626 generic.go:334] "Generic (PLEG): container finished" podID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerID="31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36" exitCode=0 Feb 23 07:51:52 crc kubenswrapper[4626]: I0223 07:51:52.496746 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqdwj" 
event={"ID":"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7","Type":"ContainerDied","Data":"31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36"} Feb 23 07:51:53 crc kubenswrapper[4626]: I0223 07:51:53.506061 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqdwj" event={"ID":"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7","Type":"ContainerStarted","Data":"5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb"} Feb 23 07:51:53 crc kubenswrapper[4626]: I0223 07:51:53.527118 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pqdwj" podStartSLOduration=1.984807305 podStartE2EDuration="4.526838837s" podCreationTimestamp="2026-02-23 07:51:49 +0000 UTC" firstStartedPulling="2026-02-23 07:51:50.481910294 +0000 UTC m=+4262.821239560" lastFinishedPulling="2026-02-23 07:51:53.023941826 +0000 UTC m=+4265.363271092" observedRunningTime="2026-02-23 07:51:53.520121542 +0000 UTC m=+4265.859450808" watchObservedRunningTime="2026-02-23 07:51:53.526838837 +0000 UTC m=+4265.866168103" Feb 23 07:51:59 crc kubenswrapper[4626]: I0223 07:51:59.822690 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:59 crc kubenswrapper[4626]: I0223 07:51:59.823900 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:51:59 crc kubenswrapper[4626]: I0223 07:51:59.862883 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:52:00 crc kubenswrapper[4626]: I0223 07:52:00.597041 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:52:00 crc kubenswrapper[4626]: I0223 07:52:00.640864 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-pqdwj"] Feb 23 07:52:02 crc kubenswrapper[4626]: I0223 07:52:02.577012 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pqdwj" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="registry-server" containerID="cri-o://5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb" gracePeriod=2 Feb 23 07:52:02 crc kubenswrapper[4626]: I0223 07:52:02.999664 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.143035 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbqn9\" (UniqueName: \"kubernetes.io/projected/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-kube-api-access-cbqn9\") pod \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.143363 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-catalog-content\") pod \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.143409 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-utilities\") pod \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\" (UID: \"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7\") " Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.144087 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-utilities" (OuterVolumeSpecName: "utilities") pod "38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" (UID: 
"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.144558 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.149127 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-kube-api-access-cbqn9" (OuterVolumeSpecName: "kube-api-access-cbqn9") pod "38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" (UID: "38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7"). InnerVolumeSpecName "kube-api-access-cbqn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.187394 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" (UID: "38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.246761 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbqn9\" (UniqueName: \"kubernetes.io/projected/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-kube-api-access-cbqn9\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.246796 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.586940 4626 generic.go:334] "Generic (PLEG): container finished" podID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerID="5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb" exitCode=0 Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.586985 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqdwj" event={"ID":"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7","Type":"ContainerDied","Data":"5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb"} Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.587006 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pqdwj" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.587026 4626 scope.go:117] "RemoveContainer" containerID="5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.587011 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pqdwj" event={"ID":"38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7","Type":"ContainerDied","Data":"9536cb1b0a6a324f546da66278b3a096765cf0c1bd0b4610b032d2457d3233f7"} Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.614253 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pqdwj"] Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.618036 4626 scope.go:117] "RemoveContainer" containerID="31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.621970 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pqdwj"] Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.719241 4626 scope.go:117] "RemoveContainer" containerID="0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.770969 4626 scope.go:117] "RemoveContainer" containerID="5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb" Feb 23 07:52:03 crc kubenswrapper[4626]: E0223 07:52:03.771710 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb\": container with ID starting with 5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb not found: ID does not exist" containerID="5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.771749 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb"} err="failed to get container status \"5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb\": rpc error: code = NotFound desc = could not find container \"5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb\": container with ID starting with 5ccaba46568269f26d25dc32f7ea011f1c8544818533df731742d72dcd3beccb not found: ID does not exist" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.771772 4626 scope.go:117] "RemoveContainer" containerID="31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36" Feb 23 07:52:03 crc kubenswrapper[4626]: E0223 07:52:03.772253 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36\": container with ID starting with 31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36 not found: ID does not exist" containerID="31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.772290 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36"} err="failed to get container status \"31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36\": rpc error: code = NotFound desc = could not find container \"31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36\": container with ID starting with 31a5480fa598bda6c567b25c1f21b9ee9022c854f889525df92461ee4ef41d36 not found: ID does not exist" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.772318 4626 scope.go:117] "RemoveContainer" containerID="0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610" Feb 23 07:52:03 crc kubenswrapper[4626]: E0223 
07:52:03.773136 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610\": container with ID starting with 0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610 not found: ID does not exist" containerID="0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.773167 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610"} err="failed to get container status \"0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610\": rpc error: code = NotFound desc = could not find container \"0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610\": container with ID starting with 0208fdebf0a9a4420a21fbb91f402f4d1bb5ca851bcab166bdb747d2b9871610 not found: ID does not exist" Feb 23 07:52:03 crc kubenswrapper[4626]: I0223 07:52:03.994457 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" path="/var/lib/kubelet/pods/38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7/volumes" Feb 23 07:53:25 crc kubenswrapper[4626]: I0223 07:53:25.685481 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:53:25 crc kubenswrapper[4626]: I0223 07:53:25.686173 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 07:53:55 crc kubenswrapper[4626]: I0223 07:53:55.685728 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:53:55 crc kubenswrapper[4626]: I0223 07:53:55.686799 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.180855 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fzjsg"] Feb 23 07:54:01 crc kubenswrapper[4626]: E0223 07:54:01.181703 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="extract-utilities" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.181718 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="extract-utilities" Feb 23 07:54:01 crc kubenswrapper[4626]: E0223 07:54:01.181729 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="extract-content" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.181735 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="extract-content" Feb 23 07:54:01 crc kubenswrapper[4626]: E0223 07:54:01.181766 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="registry-server" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.181772 4626 
state_mem.go:107] "Deleted CPUSet assignment" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="registry-server" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.181930 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d8ef44-e9fb-4c81-b7e4-04cdb691ccb7" containerName="registry-server" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.183241 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.208135 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzjsg"] Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.343718 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-utilities\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.343786 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6dlh\" (UniqueName: \"kubernetes.io/projected/d93d629e-8112-412f-a0f8-48b90b347850-kube-api-access-z6dlh\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.343946 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-catalog-content\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.448558 
4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6dlh\" (UniqueName: \"kubernetes.io/projected/d93d629e-8112-412f-a0f8-48b90b347850-kube-api-access-z6dlh\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.448632 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-catalog-content\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.448756 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-utilities\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.449107 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-utilities\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.450326 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-catalog-content\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.471191 4626 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z6dlh\" (UniqueName: \"kubernetes.io/projected/d93d629e-8112-412f-a0f8-48b90b347850-kube-api-access-z6dlh\") pod \"redhat-marketplace-fzjsg\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.498853 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:01 crc kubenswrapper[4626]: I0223 07:54:01.981358 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzjsg"] Feb 23 07:54:02 crc kubenswrapper[4626]: I0223 07:54:02.603666 4626 generic.go:334] "Generic (PLEG): container finished" podID="d93d629e-8112-412f-a0f8-48b90b347850" containerID="a190a352b2b65d5ba9d84d69634e8394ec3e89e9e7229f9f315eabb6dbca811f" exitCode=0 Feb 23 07:54:02 crc kubenswrapper[4626]: I0223 07:54:02.603767 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzjsg" event={"ID":"d93d629e-8112-412f-a0f8-48b90b347850","Type":"ContainerDied","Data":"a190a352b2b65d5ba9d84d69634e8394ec3e89e9e7229f9f315eabb6dbca811f"} Feb 23 07:54:02 crc kubenswrapper[4626]: I0223 07:54:02.604013 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzjsg" event={"ID":"d93d629e-8112-412f-a0f8-48b90b347850","Type":"ContainerStarted","Data":"cafe55f318da5b3ad177978a25f551c6e1af664f0f157786397fb28af4cf64fb"} Feb 23 07:54:03 crc kubenswrapper[4626]: I0223 07:54:03.613147 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzjsg" event={"ID":"d93d629e-8112-412f-a0f8-48b90b347850","Type":"ContainerStarted","Data":"f2d2d00c11d34f67261cdcca8451891730e3b3baae87591696fd8fc04cd293e1"} Feb 23 07:54:04 crc kubenswrapper[4626]: I0223 07:54:04.622731 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="d93d629e-8112-412f-a0f8-48b90b347850" containerID="f2d2d00c11d34f67261cdcca8451891730e3b3baae87591696fd8fc04cd293e1" exitCode=0 Feb 23 07:54:04 crc kubenswrapper[4626]: I0223 07:54:04.622786 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzjsg" event={"ID":"d93d629e-8112-412f-a0f8-48b90b347850","Type":"ContainerDied","Data":"f2d2d00c11d34f67261cdcca8451891730e3b3baae87591696fd8fc04cd293e1"} Feb 23 07:54:05 crc kubenswrapper[4626]: I0223 07:54:05.631780 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzjsg" event={"ID":"d93d629e-8112-412f-a0f8-48b90b347850","Type":"ContainerStarted","Data":"956a9392aec7aa951088c429d41337f14a96af8675ddc0b8bc367bc986b0e8f4"} Feb 23 07:54:05 crc kubenswrapper[4626]: I0223 07:54:05.650471 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fzjsg" podStartSLOduration=2.092664298 podStartE2EDuration="4.650459653s" podCreationTimestamp="2026-02-23 07:54:01 +0000 UTC" firstStartedPulling="2026-02-23 07:54:02.606307875 +0000 UTC m=+4394.945637141" lastFinishedPulling="2026-02-23 07:54:05.16410323 +0000 UTC m=+4397.503432496" observedRunningTime="2026-02-23 07:54:05.647545359 +0000 UTC m=+4397.986874625" watchObservedRunningTime="2026-02-23 07:54:05.650459653 +0000 UTC m=+4397.989788920" Feb 23 07:54:11 crc kubenswrapper[4626]: I0223 07:54:11.500143 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:11 crc kubenswrapper[4626]: I0223 07:54:11.500682 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:11 crc kubenswrapper[4626]: I0223 07:54:11.533881 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:11 crc 
kubenswrapper[4626]: I0223 07:54:11.724780 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:14 crc kubenswrapper[4626]: I0223 07:54:14.570217 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzjsg"] Feb 23 07:54:14 crc kubenswrapper[4626]: I0223 07:54:14.571216 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fzjsg" podUID="d93d629e-8112-412f-a0f8-48b90b347850" containerName="registry-server" containerID="cri-o://956a9392aec7aa951088c429d41337f14a96af8675ddc0b8bc367bc986b0e8f4" gracePeriod=2 Feb 23 07:54:14 crc kubenswrapper[4626]: I0223 07:54:14.730697 4626 generic.go:334] "Generic (PLEG): container finished" podID="d93d629e-8112-412f-a0f8-48b90b347850" containerID="956a9392aec7aa951088c429d41337f14a96af8675ddc0b8bc367bc986b0e8f4" exitCode=0 Feb 23 07:54:14 crc kubenswrapper[4626]: I0223 07:54:14.730754 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzjsg" event={"ID":"d93d629e-8112-412f-a0f8-48b90b347850","Type":"ContainerDied","Data":"956a9392aec7aa951088c429d41337f14a96af8675ddc0b8bc367bc986b0e8f4"} Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.240225 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.293011 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-utilities\") pod \"d93d629e-8112-412f-a0f8-48b90b347850\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.293150 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6dlh\" (UniqueName: \"kubernetes.io/projected/d93d629e-8112-412f-a0f8-48b90b347850-kube-api-access-z6dlh\") pod \"d93d629e-8112-412f-a0f8-48b90b347850\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.293216 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-catalog-content\") pod \"d93d629e-8112-412f-a0f8-48b90b347850\" (UID: \"d93d629e-8112-412f-a0f8-48b90b347850\") " Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.293786 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-utilities" (OuterVolumeSpecName: "utilities") pod "d93d629e-8112-412f-a0f8-48b90b347850" (UID: "d93d629e-8112-412f-a0f8-48b90b347850"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.294318 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.306739 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93d629e-8112-412f-a0f8-48b90b347850-kube-api-access-z6dlh" (OuterVolumeSpecName: "kube-api-access-z6dlh") pod "d93d629e-8112-412f-a0f8-48b90b347850" (UID: "d93d629e-8112-412f-a0f8-48b90b347850"). InnerVolumeSpecName "kube-api-access-z6dlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.313303 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d93d629e-8112-412f-a0f8-48b90b347850" (UID: "d93d629e-8112-412f-a0f8-48b90b347850"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.397006 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6dlh\" (UniqueName: \"kubernetes.io/projected/d93d629e-8112-412f-a0f8-48b90b347850-kube-api-access-z6dlh\") on node \"crc\" DevicePath \"\"" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.397043 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d93d629e-8112-412f-a0f8-48b90b347850-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.743592 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fzjsg" event={"ID":"d93d629e-8112-412f-a0f8-48b90b347850","Type":"ContainerDied","Data":"cafe55f318da5b3ad177978a25f551c6e1af664f0f157786397fb28af4cf64fb"} Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.743659 4626 scope.go:117] "RemoveContainer" containerID="956a9392aec7aa951088c429d41337f14a96af8675ddc0b8bc367bc986b0e8f4" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.743712 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fzjsg" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.766060 4626 scope.go:117] "RemoveContainer" containerID="f2d2d00c11d34f67261cdcca8451891730e3b3baae87591696fd8fc04cd293e1" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.781971 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzjsg"] Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.786968 4626 scope.go:117] "RemoveContainer" containerID="a190a352b2b65d5ba9d84d69634e8394ec3e89e9e7229f9f315eabb6dbca811f" Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.791544 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fzjsg"] Feb 23 07:54:15 crc kubenswrapper[4626]: I0223 07:54:15.993264 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93d629e-8112-412f-a0f8-48b90b347850" path="/var/lib/kubelet/pods/d93d629e-8112-412f-a0f8-48b90b347850/volumes" Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.685582 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.686175 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.686222 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 
07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.686949 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.687002 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" gracePeriod=600 Feb 23 07:54:25 crc kubenswrapper[4626]: E0223 07:54:25.804403 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.842799 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" exitCode=0 Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.842832 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c"} Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.842884 4626 scope.go:117] 
"RemoveContainer" containerID="770490da4aa00de65c536d96532e92f0bfe04f6b4727040a71d0281e00477008" Feb 23 07:54:25 crc kubenswrapper[4626]: I0223 07:54:25.843554 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:54:25 crc kubenswrapper[4626]: E0223 07:54:25.843865 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:54:38 crc kubenswrapper[4626]: I0223 07:54:38.982752 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:54:38 crc kubenswrapper[4626]: E0223 07:54:38.983701 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.046963 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gjrpr"] Feb 23 07:54:53 crc kubenswrapper[4626]: E0223 07:54:53.047823 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93d629e-8112-412f-a0f8-48b90b347850" containerName="extract-utilities" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.047837 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93d629e-8112-412f-a0f8-48b90b347850" 
containerName="extract-utilities" Feb 23 07:54:53 crc kubenswrapper[4626]: E0223 07:54:53.047848 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93d629e-8112-412f-a0f8-48b90b347850" containerName="registry-server" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.047854 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93d629e-8112-412f-a0f8-48b90b347850" containerName="registry-server" Feb 23 07:54:53 crc kubenswrapper[4626]: E0223 07:54:53.047871 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93d629e-8112-412f-a0f8-48b90b347850" containerName="extract-content" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.047877 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93d629e-8112-412f-a0f8-48b90b347850" containerName="extract-content" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.048055 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d93d629e-8112-412f-a0f8-48b90b347850" containerName="registry-server" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.049279 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.058297 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gjrpr"] Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.178702 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z6k4\" (UniqueName: \"kubernetes.io/projected/4122031e-f9b6-4efa-8596-bee15490b6cd-kube-api-access-2z6k4\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.178786 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-catalog-content\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.178990 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-utilities\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.281085 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-catalog-content\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.281236 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-utilities\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.281556 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z6k4\" (UniqueName: \"kubernetes.io/projected/4122031e-f9b6-4efa-8596-bee15490b6cd-kube-api-access-2z6k4\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.281685 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-catalog-content\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.281722 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-utilities\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.298017 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z6k4\" (UniqueName: \"kubernetes.io/projected/4122031e-f9b6-4efa-8596-bee15490b6cd-kube-api-access-2z6k4\") pod \"certified-operators-gjrpr\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.365515 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.780650 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gjrpr"] Feb 23 07:54:53 crc kubenswrapper[4626]: W0223 07:54:53.786278 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4122031e_f9b6_4efa_8596_bee15490b6cd.slice/crio-b2b465eefc4e35dda30e54d816244f52a7fb78a4f94fcedecbc3eef8a2d12720 WatchSource:0}: Error finding container b2b465eefc4e35dda30e54d816244f52a7fb78a4f94fcedecbc3eef8a2d12720: Status 404 returned error can't find the container with id b2b465eefc4e35dda30e54d816244f52a7fb78a4f94fcedecbc3eef8a2d12720 Feb 23 07:54:53 crc kubenswrapper[4626]: I0223 07:54:53.985781 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:54:53 crc kubenswrapper[4626]: E0223 07:54:53.986247 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:54:54 crc kubenswrapper[4626]: I0223 07:54:54.090107 4626 generic.go:334] "Generic (PLEG): container finished" podID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerID="bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2" exitCode=0 Feb 23 07:54:54 crc kubenswrapper[4626]: I0223 07:54:54.090163 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjrpr" 
event={"ID":"4122031e-f9b6-4efa-8596-bee15490b6cd","Type":"ContainerDied","Data":"bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2"} Feb 23 07:54:54 crc kubenswrapper[4626]: I0223 07:54:54.090196 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjrpr" event={"ID":"4122031e-f9b6-4efa-8596-bee15490b6cd","Type":"ContainerStarted","Data":"b2b465eefc4e35dda30e54d816244f52a7fb78a4f94fcedecbc3eef8a2d12720"} Feb 23 07:54:54 crc kubenswrapper[4626]: I0223 07:54:54.093776 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:54:55 crc kubenswrapper[4626]: I0223 07:54:55.102485 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjrpr" event={"ID":"4122031e-f9b6-4efa-8596-bee15490b6cd","Type":"ContainerStarted","Data":"8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5"} Feb 23 07:54:56 crc kubenswrapper[4626]: I0223 07:54:56.112686 4626 generic.go:334] "Generic (PLEG): container finished" podID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerID="8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5" exitCode=0 Feb 23 07:54:56 crc kubenswrapper[4626]: I0223 07:54:56.112727 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjrpr" event={"ID":"4122031e-f9b6-4efa-8596-bee15490b6cd","Type":"ContainerDied","Data":"8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5"} Feb 23 07:54:57 crc kubenswrapper[4626]: I0223 07:54:57.125256 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjrpr" event={"ID":"4122031e-f9b6-4efa-8596-bee15490b6cd","Type":"ContainerStarted","Data":"bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5"} Feb 23 07:54:57 crc kubenswrapper[4626]: I0223 07:54:57.146344 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-gjrpr" podStartSLOduration=1.626015161 podStartE2EDuration="4.146321551s" podCreationTimestamp="2026-02-23 07:54:53 +0000 UTC" firstStartedPulling="2026-02-23 07:54:54.092716068 +0000 UTC m=+4446.432045334" lastFinishedPulling="2026-02-23 07:54:56.613022457 +0000 UTC m=+4448.952351724" observedRunningTime="2026-02-23 07:54:57.143900808 +0000 UTC m=+4449.483230074" watchObservedRunningTime="2026-02-23 07:54:57.146321551 +0000 UTC m=+4449.485650818" Feb 23 07:55:03 crc kubenswrapper[4626]: I0223 07:55:03.366408 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:55:03 crc kubenswrapper[4626]: I0223 07:55:03.367059 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:55:03 crc kubenswrapper[4626]: I0223 07:55:03.415023 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:55:04 crc kubenswrapper[4626]: I0223 07:55:04.223256 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:55:04 crc kubenswrapper[4626]: I0223 07:55:04.267588 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gjrpr"] Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.200440 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gjrpr" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerName="registry-server" containerID="cri-o://bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5" gracePeriod=2 Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.647766 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.711945 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-catalog-content\") pod \"4122031e-f9b6-4efa-8596-bee15490b6cd\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.712067 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-utilities\") pod \"4122031e-f9b6-4efa-8596-bee15490b6cd\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.712124 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z6k4\" (UniqueName: \"kubernetes.io/projected/4122031e-f9b6-4efa-8596-bee15490b6cd-kube-api-access-2z6k4\") pod \"4122031e-f9b6-4efa-8596-bee15490b6cd\" (UID: \"4122031e-f9b6-4efa-8596-bee15490b6cd\") " Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.713132 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-utilities" (OuterVolumeSpecName: "utilities") pod "4122031e-f9b6-4efa-8596-bee15490b6cd" (UID: "4122031e-f9b6-4efa-8596-bee15490b6cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.720986 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4122031e-f9b6-4efa-8596-bee15490b6cd-kube-api-access-2z6k4" (OuterVolumeSpecName: "kube-api-access-2z6k4") pod "4122031e-f9b6-4efa-8596-bee15490b6cd" (UID: "4122031e-f9b6-4efa-8596-bee15490b6cd"). InnerVolumeSpecName "kube-api-access-2z6k4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.761603 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4122031e-f9b6-4efa-8596-bee15490b6cd" (UID: "4122031e-f9b6-4efa-8596-bee15490b6cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.816726 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.816762 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z6k4\" (UniqueName: \"kubernetes.io/projected/4122031e-f9b6-4efa-8596-bee15490b6cd-kube-api-access-2z6k4\") on node \"crc\" DevicePath \"\"" Feb 23 07:55:06 crc kubenswrapper[4626]: I0223 07:55:06.816778 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4122031e-f9b6-4efa-8596-bee15490b6cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.210585 4626 generic.go:334] "Generic (PLEG): container finished" podID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerID="bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5" exitCode=0 Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.210638 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjrpr" event={"ID":"4122031e-f9b6-4efa-8596-bee15490b6cd","Type":"ContainerDied","Data":"bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5"} Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.210974 4626 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gjrpr" event={"ID":"4122031e-f9b6-4efa-8596-bee15490b6cd","Type":"ContainerDied","Data":"b2b465eefc4e35dda30e54d816244f52a7fb78a4f94fcedecbc3eef8a2d12720"} Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.210665 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gjrpr" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.211004 4626 scope.go:117] "RemoveContainer" containerID="bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.244069 4626 scope.go:117] "RemoveContainer" containerID="8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.247379 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gjrpr"] Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.270325 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gjrpr"] Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.277801 4626 scope.go:117] "RemoveContainer" containerID="bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.311756 4626 scope.go:117] "RemoveContainer" containerID="bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5" Feb 23 07:55:07 crc kubenswrapper[4626]: E0223 07:55:07.312224 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5\": container with ID starting with bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5 not found: ID does not exist" containerID="bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 
07:55:07.312273 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5"} err="failed to get container status \"bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5\": rpc error: code = NotFound desc = could not find container \"bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5\": container with ID starting with bb411a4e32b8a68c8920f4745dd390466526f1c3ac89fc3b32e17302ab243ba5 not found: ID does not exist" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.312307 4626 scope.go:117] "RemoveContainer" containerID="8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5" Feb 23 07:55:07 crc kubenswrapper[4626]: E0223 07:55:07.313423 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5\": container with ID starting with 8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5 not found: ID does not exist" containerID="8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.313737 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5"} err="failed to get container status \"8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5\": rpc error: code = NotFound desc = could not find container \"8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5\": container with ID starting with 8745a18cf55ec2b9986b0f99235b7530d6b8123a2d1cbe6b80e4fad65b3724e5 not found: ID does not exist" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.313766 4626 scope.go:117] "RemoveContainer" containerID="bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2" Feb 23 07:55:07 crc 
kubenswrapper[4626]: E0223 07:55:07.314099 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2\": container with ID starting with bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2 not found: ID does not exist" containerID="bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.314123 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2"} err="failed to get container status \"bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2\": rpc error: code = NotFound desc = could not find container \"bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2\": container with ID starting with bbc0f8c7a79f8a76fd08265f6c9ea69d0500427b21fb4347b4405db836d797d2 not found: ID does not exist" Feb 23 07:55:07 crc kubenswrapper[4626]: I0223 07:55:07.991921 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" path="/var/lib/kubelet/pods/4122031e-f9b6-4efa-8596-bee15490b6cd/volumes" Feb 23 07:55:08 crc kubenswrapper[4626]: I0223 07:55:08.981921 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:55:08 crc kubenswrapper[4626]: E0223 07:55:08.982565 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:55:20 crc 
kubenswrapper[4626]: I0223 07:55:20.982393 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:55:20 crc kubenswrapper[4626]: E0223 07:55:20.983297 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:55:33 crc kubenswrapper[4626]: I0223 07:55:33.981669 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:55:33 crc kubenswrapper[4626]: E0223 07:55:33.982486 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:55:45 crc kubenswrapper[4626]: I0223 07:55:45.982890 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:55:45 crc kubenswrapper[4626]: E0223 07:55:45.983837 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 
23 07:55:47 crc kubenswrapper[4626]: I0223 07:55:47.620263 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6465458495-hgsdz" podUID="434e199c-4e18-4274-bbaa-f81f2e2a697b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 23 07:55:57 crc kubenswrapper[4626]: I0223 07:55:57.989130 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:55:57 crc kubenswrapper[4626]: E0223 07:55:57.990143 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:56:09 crc kubenswrapper[4626]: I0223 07:56:09.983153 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:56:09 crc kubenswrapper[4626]: E0223 07:56:09.984294 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:56:20 crc kubenswrapper[4626]: I0223 07:56:20.982907 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:56:20 crc kubenswrapper[4626]: E0223 07:56:20.984945 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:56:35 crc kubenswrapper[4626]: I0223 07:56:35.982736 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:56:35 crc kubenswrapper[4626]: E0223 07:56:35.983888 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:56:50 crc kubenswrapper[4626]: I0223 07:56:50.983168 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:56:50 crc kubenswrapper[4626]: E0223 07:56:50.984354 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:57:05 crc kubenswrapper[4626]: I0223 07:57:05.982106 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:57:05 crc kubenswrapper[4626]: E0223 07:57:05.982976 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:57:17 crc kubenswrapper[4626]: I0223 07:57:17.987202 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:57:17 crc kubenswrapper[4626]: E0223 07:57:17.988093 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:57:30 crc kubenswrapper[4626]: I0223 07:57:30.982855 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:57:30 crc kubenswrapper[4626]: E0223 07:57:30.983974 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:57:43 crc kubenswrapper[4626]: I0223 07:57:43.983277 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:57:43 crc kubenswrapper[4626]: E0223 07:57:43.984037 4626 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:57:57 crc kubenswrapper[4626]: I0223 07:57:57.988852 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:57:57 crc kubenswrapper[4626]: E0223 07:57:57.989543 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:58:10 crc kubenswrapper[4626]: I0223 07:58:10.982241 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:58:10 crc kubenswrapper[4626]: E0223 07:58:10.982876 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:58:23 crc kubenswrapper[4626]: I0223 07:58:23.982426 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:58:23 crc kubenswrapper[4626]: E0223 07:58:23.983642 4626 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:58:36 crc kubenswrapper[4626]: I0223 07:58:36.982628 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:58:36 crc kubenswrapper[4626]: E0223 07:58:36.984805 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:58:50 crc kubenswrapper[4626]: I0223 07:58:50.983010 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:58:50 crc kubenswrapper[4626]: E0223 07:58:50.984165 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:59:05 crc kubenswrapper[4626]: I0223 07:59:05.982800 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:59:05 crc kubenswrapper[4626]: E0223 07:59:05.983916 4626 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:59:18 crc kubenswrapper[4626]: I0223 07:59:18.982116 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:59:18 crc kubenswrapper[4626]: E0223 07:59:18.982975 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 07:59:32 crc kubenswrapper[4626]: I0223 07:59:32.982929 4626 scope.go:117] "RemoveContainer" containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c" Feb 23 07:59:33 crc kubenswrapper[4626]: I0223 07:59:33.586361 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"e459a9ed2ad6229b1cb787ff55877ab11d2a5f9aa86208083379ca7a8a6dc7f0"} Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.200239 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"] Feb 23 08:00:00 crc kubenswrapper[4626]: E0223 08:00:00.201289 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" 
containerName="extract-content" Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.201306 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerName="extract-content" Feb 23 08:00:00 crc kubenswrapper[4626]: E0223 08:00:00.201348 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerName="registry-server" Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.201354 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerName="registry-server" Feb 23 08:00:00 crc kubenswrapper[4626]: E0223 08:00:00.201366 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerName="extract-utilities" Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.201371 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerName="extract-utilities" Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.201649 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4122031e-f9b6-4efa-8596-bee15490b6cd" containerName="registry-server" Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.202377 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.208111 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.208113 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.209902 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"]
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.273405 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ac33a59-edb4-492c-a2fa-df4a119ade7b-secret-volume\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.273739 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chskp\" (UniqueName: \"kubernetes.io/projected/1ac33a59-edb4-492c-a2fa-df4a119ade7b-kube-api-access-chskp\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.273800 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ac33a59-edb4-492c-a2fa-df4a119ade7b-config-volume\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.376339 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ac33a59-edb4-492c-a2fa-df4a119ade7b-secret-volume\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.376536 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chskp\" (UniqueName: \"kubernetes.io/projected/1ac33a59-edb4-492c-a2fa-df4a119ade7b-kube-api-access-chskp\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.376570 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ac33a59-edb4-492c-a2fa-df4a119ade7b-config-volume\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.377686 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ac33a59-edb4-492c-a2fa-df4a119ade7b-config-volume\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.384843 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ac33a59-edb4-492c-a2fa-df4a119ade7b-secret-volume\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.394155 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chskp\" (UniqueName: \"kubernetes.io/projected/1ac33a59-edb4-492c-a2fa-df4a119ade7b-kube-api-access-chskp\") pod \"collect-profiles-29530560-pf2kv\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:00 crc kubenswrapper[4626]: I0223 08:00:00.519804 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:01 crc kubenswrapper[4626]: I0223 08:00:01.170129 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"]
Feb 23 08:00:01 crc kubenswrapper[4626]: I0223 08:00:01.840924 4626 generic.go:334] "Generic (PLEG): container finished" podID="1ac33a59-edb4-492c-a2fa-df4a119ade7b" containerID="50ffb83eace4b93db617b7fe5c2b16a1920fe316f2454b161c00738f95cf06f7" exitCode=0
Feb 23 08:00:01 crc kubenswrapper[4626]: I0223 08:00:01.841013 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv" event={"ID":"1ac33a59-edb4-492c-a2fa-df4a119ade7b","Type":"ContainerDied","Data":"50ffb83eace4b93db617b7fe5c2b16a1920fe316f2454b161c00738f95cf06f7"}
Feb 23 08:00:01 crc kubenswrapper[4626]: I0223 08:00:01.841602 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv" event={"ID":"1ac33a59-edb4-492c-a2fa-df4a119ade7b","Type":"ContainerStarted","Data":"b3aae54fe405ad4da138e83472a4e0fa0182fb437a9a07595b329ff842788307"}
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.119899 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.144581 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chskp\" (UniqueName: \"kubernetes.io/projected/1ac33a59-edb4-492c-a2fa-df4a119ade7b-kube-api-access-chskp\") pod \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") "
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.144675 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ac33a59-edb4-492c-a2fa-df4a119ade7b-secret-volume\") pod \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") "
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.144764 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ac33a59-edb4-492c-a2fa-df4a119ade7b-config-volume\") pod \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\" (UID: \"1ac33a59-edb4-492c-a2fa-df4a119ade7b\") "
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.145815 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ac33a59-edb4-492c-a2fa-df4a119ade7b-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ac33a59-edb4-492c-a2fa-df4a119ade7b" (UID: "1ac33a59-edb4-492c-a2fa-df4a119ade7b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.154034 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac33a59-edb4-492c-a2fa-df4a119ade7b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ac33a59-edb4-492c-a2fa-df4a119ade7b" (UID: "1ac33a59-edb4-492c-a2fa-df4a119ade7b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.156711 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac33a59-edb4-492c-a2fa-df4a119ade7b-kube-api-access-chskp" (OuterVolumeSpecName: "kube-api-access-chskp") pod "1ac33a59-edb4-492c-a2fa-df4a119ade7b" (UID: "1ac33a59-edb4-492c-a2fa-df4a119ade7b"). InnerVolumeSpecName "kube-api-access-chskp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.249327 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chskp\" (UniqueName: \"kubernetes.io/projected/1ac33a59-edb4-492c-a2fa-df4a119ade7b-kube-api-access-chskp\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.249356 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ac33a59-edb4-492c-a2fa-df4a119ade7b-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.249369 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ac33a59-edb4-492c-a2fa-df4a119ade7b-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.861743 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv" event={"ID":"1ac33a59-edb4-492c-a2fa-df4a119ade7b","Type":"ContainerDied","Data":"b3aae54fe405ad4da138e83472a4e0fa0182fb437a9a07595b329ff842788307"}
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.862170 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3aae54fe405ad4da138e83472a4e0fa0182fb437a9a07595b329ff842788307"
Feb 23 08:00:03 crc kubenswrapper[4626]: I0223 08:00:03.861819 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"
Feb 23 08:00:04 crc kubenswrapper[4626]: I0223 08:00:04.195619 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2"]
Feb 23 08:00:04 crc kubenswrapper[4626]: I0223 08:00:04.203809 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-27ph2"]
Feb 23 08:00:05 crc kubenswrapper[4626]: I0223 08:00:05.993244 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c6d748-b917-4113-b932-b846717ebed2" path="/var/lib/kubelet/pods/13c6d748-b917-4113-b932-b846717ebed2/volumes"
Feb 23 08:00:22 crc kubenswrapper[4626]: I0223 08:00:22.939893 4626 scope.go:117] "RemoveContainer" containerID="f3093b7741acda3a958bef1ba5e379cef5a42a8b7232ca0827fe2c1dbe15d8d9"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.020094 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xcz2q"]
Feb 23 08:00:45 crc kubenswrapper[4626]: E0223 08:00:45.021268 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac33a59-edb4-492c-a2fa-df4a119ade7b" containerName="collect-profiles"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.021283 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac33a59-edb4-492c-a2fa-df4a119ade7b" containerName="collect-profiles"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.021539 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac33a59-edb4-492c-a2fa-df4a119ade7b" containerName="collect-profiles"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.023150 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.033121 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcz2q"]
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.060245 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-utilities\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.060810 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-kube-api-access-shmcj\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.060844 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.162466 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-utilities\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.162726 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-kube-api-access-shmcj\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.162755 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.162921 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-utilities\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.163091 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.179706 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-kube-api-access-shmcj\") pod \"redhat-operators-xcz2q\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") " pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.340994 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:45 crc kubenswrapper[4626]: I0223 08:00:45.746713 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcz2q"]
Feb 23 08:00:46 crc kubenswrapper[4626]: I0223 08:00:46.219112 4626 generic.go:334] "Generic (PLEG): container finished" podID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerID="948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec" exitCode=0
Feb 23 08:00:46 crc kubenswrapper[4626]: I0223 08:00:46.219329 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcz2q" event={"ID":"1f51bb94-0d21-4f62-9e36-e2c9d3393bee","Type":"ContainerDied","Data":"948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec"}
Feb 23 08:00:46 crc kubenswrapper[4626]: I0223 08:00:46.219355 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcz2q" event={"ID":"1f51bb94-0d21-4f62-9e36-e2c9d3393bee","Type":"ContainerStarted","Data":"11cf76535a26dd5321b592c46e727d86d8796d91a71407b5b797f1a969c4fa3b"}
Feb 23 08:00:46 crc kubenswrapper[4626]: I0223 08:00:46.221251 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:00:48 crc kubenswrapper[4626]: I0223 08:00:48.239592 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcz2q" event={"ID":"1f51bb94-0d21-4f62-9e36-e2c9d3393bee","Type":"ContainerStarted","Data":"53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046"}
Feb 23 08:00:51 crc kubenswrapper[4626]: I0223 08:00:51.263929 4626 generic.go:334] "Generic (PLEG): container finished" podID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerID="53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046" exitCode=0
Feb 23 08:00:51 crc kubenswrapper[4626]: I0223 08:00:51.264302 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcz2q" event={"ID":"1f51bb94-0d21-4f62-9e36-e2c9d3393bee","Type":"ContainerDied","Data":"53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046"}
Feb 23 08:00:52 crc kubenswrapper[4626]: I0223 08:00:52.275134 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcz2q" event={"ID":"1f51bb94-0d21-4f62-9e36-e2c9d3393bee","Type":"ContainerStarted","Data":"b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0"}
Feb 23 08:00:52 crc kubenswrapper[4626]: I0223 08:00:52.300237 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xcz2q" podStartSLOduration=2.638327634 podStartE2EDuration="8.300216077s" podCreationTimestamp="2026-02-23 08:00:44 +0000 UTC" firstStartedPulling="2026-02-23 08:00:46.220999773 +0000 UTC m=+4798.560329040" lastFinishedPulling="2026-02-23 08:00:51.882888218 +0000 UTC m=+4804.222217483" observedRunningTime="2026-02-23 08:00:52.298564223 +0000 UTC m=+4804.637893490" watchObservedRunningTime="2026-02-23 08:00:52.300216077 +0000 UTC m=+4804.639545343"
Feb 23 08:00:55 crc kubenswrapper[4626]: I0223 08:00:55.342148 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:55 crc kubenswrapper[4626]: I0223 08:00:55.342191 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:00:56 crc kubenswrapper[4626]: I0223 08:00:56.382454 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xcz2q" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:00:56 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:00:56 crc kubenswrapper[4626]: >
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.149427 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29530561-fjctr"]
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.151566 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.163098 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530561-fjctr"]
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.258056 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4w6m\" (UniqueName: \"kubernetes.io/projected/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-kube-api-access-p4w6m\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.258325 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-config-data\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.258404 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-fernet-keys\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.258457 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-combined-ca-bundle\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.360481 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-config-data\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.360573 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-fernet-keys\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.360616 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-combined-ca-bundle\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.360859 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4w6m\" (UniqueName: \"kubernetes.io/projected/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-kube-api-access-p4w6m\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.369361 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-config-data\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.370162 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-fernet-keys\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.376129 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4w6m\" (UniqueName: \"kubernetes.io/projected/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-kube-api-access-p4w6m\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.385886 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-combined-ca-bundle\") pod \"keystone-cron-29530561-fjctr\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") " pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.472667 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:00 crc kubenswrapper[4626]: I0223 08:01:00.927640 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530561-fjctr"]
Feb 23 08:01:01 crc kubenswrapper[4626]: I0223 08:01:01.365233 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530561-fjctr" event={"ID":"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b","Type":"ContainerStarted","Data":"960ff9db7c9bfda3f9c32c5fbcca15ebf671a1db078f7204c1f2c23583733e40"}
Feb 23 08:01:01 crc kubenswrapper[4626]: I0223 08:01:01.366823 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530561-fjctr" event={"ID":"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b","Type":"ContainerStarted","Data":"d843a0039a2e9edbc874507254734ff1353592ab55f315cf2cadc605e3791def"}
Feb 23 08:01:01 crc kubenswrapper[4626]: I0223 08:01:01.396370 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29530561-fjctr" podStartSLOduration=1.396354609 podStartE2EDuration="1.396354609s" podCreationTimestamp="2026-02-23 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:01:01.389774141 +0000 UTC m=+4813.729103407" watchObservedRunningTime="2026-02-23 08:01:01.396354609 +0000 UTC m=+4813.735683875"
Feb 23 08:01:04 crc kubenswrapper[4626]: I0223 08:01:04.389890 4626 generic.go:334] "Generic (PLEG): container finished" podID="82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" containerID="960ff9db7c9bfda3f9c32c5fbcca15ebf671a1db078f7204c1f2c23583733e40" exitCode=0
Feb 23 08:01:04 crc kubenswrapper[4626]: I0223 08:01:04.389940 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530561-fjctr" event={"ID":"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b","Type":"ContainerDied","Data":"960ff9db7c9bfda3f9c32c5fbcca15ebf671a1db078f7204c1f2c23583733e40"}
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.766401 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.799098 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-fernet-keys\") pod \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") "
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.799600 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-config-data\") pod \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") "
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.799624 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-combined-ca-bundle\") pod \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") "
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.799684 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4w6m\" (UniqueName: \"kubernetes.io/projected/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-kube-api-access-p4w6m\") pod \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\" (UID: \"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b\") "
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.808958 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-kube-api-access-p4w6m" (OuterVolumeSpecName: "kube-api-access-p4w6m") pod "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" (UID: "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b"). InnerVolumeSpecName "kube-api-access-p4w6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.814275 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" (UID: "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.826547 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" (UID: "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.840113 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-config-data" (OuterVolumeSpecName: "config-data") pod "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" (UID: "82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.902493 4626 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.902544 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.902554 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:01:05 crc kubenswrapper[4626]: I0223 08:01:05.902567 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4w6m\" (UniqueName: \"kubernetes.io/projected/82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b-kube-api-access-p4w6m\") on node \"crc\" DevicePath \"\""
Feb 23 08:01:06 crc kubenswrapper[4626]: I0223 08:01:06.375780 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xcz2q" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:01:06 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:01:06 crc kubenswrapper[4626]: >
Feb 23 08:01:06 crc kubenswrapper[4626]: I0223 08:01:06.408249 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530561-fjctr" event={"ID":"82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b","Type":"ContainerDied","Data":"d843a0039a2e9edbc874507254734ff1353592ab55f315cf2cadc605e3791def"}
Feb 23 08:01:06 crc kubenswrapper[4626]: I0223 08:01:06.408303 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d843a0039a2e9edbc874507254734ff1353592ab55f315cf2cadc605e3791def"
Feb 23 08:01:06 crc kubenswrapper[4626]: I0223 08:01:06.408391 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530561-fjctr"
Feb 23 08:01:15 crc kubenswrapper[4626]: I0223 08:01:15.389208 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:01:15 crc kubenswrapper[4626]: I0223 08:01:15.440201 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.220367 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcz2q"]
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.500125 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xcz2q" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="registry-server" containerID="cri-o://b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0" gracePeriod=2
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.895143 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcz2q"
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.958294 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content\") pod \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") "
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.958462 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-utilities\") pod \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") "
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.958631 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-kube-api-access-shmcj\") pod \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") "
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.958957 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-utilities" (OuterVolumeSpecName: "utilities") pod "1f51bb94-0d21-4f62-9e36-e2c9d3393bee" (UID: "1f51bb94-0d21-4f62-9e36-e2c9d3393bee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:01:16 crc kubenswrapper[4626]: I0223 08:01:16.976122 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-kube-api-access-shmcj" (OuterVolumeSpecName: "kube-api-access-shmcj") pod "1f51bb94-0d21-4f62-9e36-e2c9d3393bee" (UID: "1f51bb94-0d21-4f62-9e36-e2c9d3393bee"). InnerVolumeSpecName "kube-api-access-shmcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.059256 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f51bb94-0d21-4f62-9e36-e2c9d3393bee" (UID: "1f51bb94-0d21-4f62-9e36-e2c9d3393bee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.059959 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content\") pod \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\" (UID: \"1f51bb94-0d21-4f62-9e36-e2c9d3393bee\") "
Feb 23 08:01:17 crc kubenswrapper[4626]: W0223 08:01:17.061474 4626 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1f51bb94-0d21-4f62-9e36-e2c9d3393bee/volumes/kubernetes.io~empty-dir/catalog-content
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.061873 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f51bb94-0d21-4f62-9e36-e2c9d3393bee" (UID: "1f51bb94-0d21-4f62-9e36-e2c9d3393bee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.062888 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.062915 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shmcj\" (UniqueName: \"kubernetes.io/projected/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-kube-api-access-shmcj\") on node \"crc\" DevicePath \"\""
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.062931 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f51bb94-0d21-4f62-9e36-e2c9d3393bee-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.510614 4626 generic.go:334] "Generic (PLEG): container finished" podID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerID="b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0" exitCode=0
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.510703 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcz2q" event={"ID":"1f51bb94-0d21-4f62-9e36-e2c9d3393bee","Type":"ContainerDied","Data":"b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0"}
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.510988 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcz2q" event={"ID":"1f51bb94-0d21-4f62-9e36-e2c9d3393bee","Type":"ContainerDied","Data":"11cf76535a26dd5321b592c46e727d86d8796d91a71407b5b797f1a969c4fa3b"}
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.511017 4626 scope.go:117] "RemoveContainer" containerID="b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0"
Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.510708
4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcz2q" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.539950 4626 scope.go:117] "RemoveContainer" containerID="53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.554552 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcz2q"] Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.575189 4626 scope.go:117] "RemoveContainer" containerID="948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.575391 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xcz2q"] Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.641046 4626 scope.go:117] "RemoveContainer" containerID="b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0" Feb 23 08:01:17 crc kubenswrapper[4626]: E0223 08:01:17.652545 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0\": container with ID starting with b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0 not found: ID does not exist" containerID="b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.652615 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0"} err="failed to get container status \"b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0\": rpc error: code = NotFound desc = could not find container \"b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0\": container with ID starting with 
b7a01ab252889bb24845a0e2ce79023861dc4c6ce61cdd621bca12a9f6d71ae0 not found: ID does not exist" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.652651 4626 scope.go:117] "RemoveContainer" containerID="53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046" Feb 23 08:01:17 crc kubenswrapper[4626]: E0223 08:01:17.656908 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046\": container with ID starting with 53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046 not found: ID does not exist" containerID="53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.656950 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046"} err="failed to get container status \"53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046\": rpc error: code = NotFound desc = could not find container \"53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046\": container with ID starting with 53664950d3c8741f0eb7ae7df844c4dd4647eaad2135de32b65ded53a35f8046 not found: ID does not exist" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.656973 4626 scope.go:117] "RemoveContainer" containerID="948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec" Feb 23 08:01:17 crc kubenswrapper[4626]: E0223 08:01:17.659557 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec\": container with ID starting with 948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec not found: ID does not exist" containerID="948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec" Feb 23 08:01:17 crc 
kubenswrapper[4626]: I0223 08:01:17.659596 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec"} err="failed to get container status \"948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec\": rpc error: code = NotFound desc = could not find container \"948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec\": container with ID starting with 948e7c53340fd21f796347dd891249d95328da6bb88601649d2eddefd7bacaec not found: ID does not exist" Feb 23 08:01:17 crc kubenswrapper[4626]: I0223 08:01:17.990672 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" path="/var/lib/kubelet/pods/1f51bb94-0d21-4f62-9e36-e2c9d3393bee/volumes" Feb 23 08:01:55 crc kubenswrapper[4626]: I0223 08:01:55.685425 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:01:55 crc kubenswrapper[4626]: I0223 08:01:55.686256 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.515013 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6fll"] Feb 23 08:02:10 crc kubenswrapper[4626]: E0223 08:02:10.516430 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="extract-utilities" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 
08:02:10.516457 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="extract-utilities" Feb 23 08:02:10 crc kubenswrapper[4626]: E0223 08:02:10.516584 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="registry-server" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.516737 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="registry-server" Feb 23 08:02:10 crc kubenswrapper[4626]: E0223 08:02:10.516752 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="extract-content" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.516760 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="extract-content" Feb 23 08:02:10 crc kubenswrapper[4626]: E0223 08:02:10.516774 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" containerName="keystone-cron" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.516782 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" containerName="keystone-cron" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.524073 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f51bb94-0d21-4f62-9e36-e2c9d3393bee" containerName="registry-server" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.524123 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b" containerName="keystone-cron" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.526299 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.529234 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6fll"] Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.610077 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-catalog-content\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.610309 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-249r2\" (UniqueName: \"kubernetes.io/projected/8d4c68dc-066d-4cc1-8312-55cbc7a01277-kube-api-access-249r2\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.610467 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-utilities\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.712552 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-catalog-content\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.712755 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-249r2\" (UniqueName: \"kubernetes.io/projected/8d4c68dc-066d-4cc1-8312-55cbc7a01277-kube-api-access-249r2\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.712894 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-utilities\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.713023 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-catalog-content\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.713275 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-utilities\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.732338 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-249r2\" (UniqueName: \"kubernetes.io/projected/8d4c68dc-066d-4cc1-8312-55cbc7a01277-kube-api-access-249r2\") pod \"community-operators-j6fll\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:10 crc kubenswrapper[4626]: I0223 08:02:10.853019 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:11 crc kubenswrapper[4626]: I0223 08:02:11.400141 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6fll"] Feb 23 08:02:11 crc kubenswrapper[4626]: I0223 08:02:11.999062 4626 generic.go:334] "Generic (PLEG): container finished" podID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerID="ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8" exitCode=0 Feb 23 08:02:11 crc kubenswrapper[4626]: I0223 08:02:11.999131 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6fll" event={"ID":"8d4c68dc-066d-4cc1-8312-55cbc7a01277","Type":"ContainerDied","Data":"ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8"} Feb 23 08:02:12 crc kubenswrapper[4626]: I0223 08:02:11.999199 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6fll" event={"ID":"8d4c68dc-066d-4cc1-8312-55cbc7a01277","Type":"ContainerStarted","Data":"41a4a85da1569fa72a73e9fd4e9016269ea50d5681b954162f021c69b0ab15f4"} Feb 23 08:02:13 crc kubenswrapper[4626]: I0223 08:02:13.009345 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6fll" event={"ID":"8d4c68dc-066d-4cc1-8312-55cbc7a01277","Type":"ContainerStarted","Data":"494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff"} Feb 23 08:02:14 crc kubenswrapper[4626]: I0223 08:02:14.022228 4626 generic.go:334] "Generic (PLEG): container finished" podID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerID="494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff" exitCode=0 Feb 23 08:02:14 crc kubenswrapper[4626]: I0223 08:02:14.022299 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6fll" 
event={"ID":"8d4c68dc-066d-4cc1-8312-55cbc7a01277","Type":"ContainerDied","Data":"494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff"} Feb 23 08:02:15 crc kubenswrapper[4626]: I0223 08:02:15.033079 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6fll" event={"ID":"8d4c68dc-066d-4cc1-8312-55cbc7a01277","Type":"ContainerStarted","Data":"a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0"} Feb 23 08:02:15 crc kubenswrapper[4626]: I0223 08:02:15.058452 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6fll" podStartSLOduration=2.543763218 podStartE2EDuration="5.05843376s" podCreationTimestamp="2026-02-23 08:02:10 +0000 UTC" firstStartedPulling="2026-02-23 08:02:12.000731102 +0000 UTC m=+4884.340060369" lastFinishedPulling="2026-02-23 08:02:14.515401646 +0000 UTC m=+4886.854730911" observedRunningTime="2026-02-23 08:02:15.051469179 +0000 UTC m=+4887.390798446" watchObservedRunningTime="2026-02-23 08:02:15.05843376 +0000 UTC m=+4887.397763026" Feb 23 08:02:20 crc kubenswrapper[4626]: I0223 08:02:20.853999 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:20 crc kubenswrapper[4626]: I0223 08:02:20.854675 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:20 crc kubenswrapper[4626]: I0223 08:02:20.895719 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:21 crc kubenswrapper[4626]: I0223 08:02:21.134545 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:21 crc kubenswrapper[4626]: I0223 08:02:21.179248 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-j6fll"] Feb 23 08:02:23 crc kubenswrapper[4626]: I0223 08:02:23.118986 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6fll" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="registry-server" containerID="cri-o://a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0" gracePeriod=2 Feb 23 08:02:23 crc kubenswrapper[4626]: I0223 08:02:23.774439 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:23 crc kubenswrapper[4626]: I0223 08:02:23.956325 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249r2\" (UniqueName: \"kubernetes.io/projected/8d4c68dc-066d-4cc1-8312-55cbc7a01277-kube-api-access-249r2\") pod \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " Feb 23 08:02:23 crc kubenswrapper[4626]: I0223 08:02:23.956635 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-catalog-content\") pod \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " Feb 23 08:02:23 crc kubenswrapper[4626]: I0223 08:02:23.956848 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-utilities\") pod \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\" (UID: \"8d4c68dc-066d-4cc1-8312-55cbc7a01277\") " Feb 23 08:02:23 crc kubenswrapper[4626]: I0223 08:02:23.958097 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-utilities" (OuterVolumeSpecName: "utilities") pod "8d4c68dc-066d-4cc1-8312-55cbc7a01277" (UID: 
"8d4c68dc-066d-4cc1-8312-55cbc7a01277"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:02:23 crc kubenswrapper[4626]: I0223 08:02:23.965351 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4c68dc-066d-4cc1-8312-55cbc7a01277-kube-api-access-249r2" (OuterVolumeSpecName: "kube-api-access-249r2") pod "8d4c68dc-066d-4cc1-8312-55cbc7a01277" (UID: "8d4c68dc-066d-4cc1-8312-55cbc7a01277"). InnerVolumeSpecName "kube-api-access-249r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.006109 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d4c68dc-066d-4cc1-8312-55cbc7a01277" (UID: "8d4c68dc-066d-4cc1-8312-55cbc7a01277"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.058983 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249r2\" (UniqueName: \"kubernetes.io/projected/8d4c68dc-066d-4cc1-8312-55cbc7a01277-kube-api-access-249r2\") on node \"crc\" DevicePath \"\"" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.059011 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.059021 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d4c68dc-066d-4cc1-8312-55cbc7a01277-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.129509 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerID="a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0" exitCode=0 Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.129552 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6fll" event={"ID":"8d4c68dc-066d-4cc1-8312-55cbc7a01277","Type":"ContainerDied","Data":"a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0"} Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.129589 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6fll" event={"ID":"8d4c68dc-066d-4cc1-8312-55cbc7a01277","Type":"ContainerDied","Data":"41a4a85da1569fa72a73e9fd4e9016269ea50d5681b954162f021c69b0ab15f4"} Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.129611 4626 scope.go:117] "RemoveContainer" containerID="a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.129661 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j6fll" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.161639 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6fll"] Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.162411 4626 scope.go:117] "RemoveContainer" containerID="494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.173311 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6fll"] Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.180251 4626 scope.go:117] "RemoveContainer" containerID="ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.215715 4626 scope.go:117] "RemoveContainer" containerID="a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0" Feb 23 08:02:24 crc kubenswrapper[4626]: E0223 08:02:24.216166 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0\": container with ID starting with a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0 not found: ID does not exist" containerID="a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.216208 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0"} err="failed to get container status \"a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0\": rpc error: code = NotFound desc = could not find container \"a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0\": container with ID starting with a4eff4a0ab624177564eaf69ca572c18b4cb73dc300bf9307a2a3d7e0cf0cec0 not 
found: ID does not exist" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.216238 4626 scope.go:117] "RemoveContainer" containerID="494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff" Feb 23 08:02:24 crc kubenswrapper[4626]: E0223 08:02:24.216752 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff\": container with ID starting with 494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff not found: ID does not exist" containerID="494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.216787 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff"} err="failed to get container status \"494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff\": rpc error: code = NotFound desc = could not find container \"494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff\": container with ID starting with 494c2fff64e909218dd1ddb7fa2bf6a8855fae86c74c34c0f87f8b2816ad63ff not found: ID does not exist" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.216809 4626 scope.go:117] "RemoveContainer" containerID="ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8" Feb 23 08:02:24 crc kubenswrapper[4626]: E0223 08:02:24.217086 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8\": container with ID starting with ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8 not found: ID does not exist" containerID="ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8" Feb 23 08:02:24 crc kubenswrapper[4626]: I0223 08:02:24.217380 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8"} err="failed to get container status \"ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8\": rpc error: code = NotFound desc = could not find container \"ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8\": container with ID starting with ff22972d84f47b7b639214fc308b5db73888843f3c47ec57706002c12e5f7da8 not found: ID does not exist" Feb 23 08:02:25 crc kubenswrapper[4626]: I0223 08:02:25.685281 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:02:25 crc kubenswrapper[4626]: I0223 08:02:25.685675 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:02:25 crc kubenswrapper[4626]: I0223 08:02:25.990850 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" path="/var/lib/kubelet/pods/8d4c68dc-066d-4cc1-8312-55cbc7a01277/volumes" Feb 23 08:02:55 crc kubenswrapper[4626]: I0223 08:02:55.685287 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:02:55 crc kubenswrapper[4626]: I0223 08:02:55.685768 4626 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:02:55 crc kubenswrapper[4626]: I0223 08:02:55.685817 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 08:02:55 crc kubenswrapper[4626]: I0223 08:02:55.686455 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e459a9ed2ad6229b1cb787ff55877ab11d2a5f9aa86208083379ca7a8a6dc7f0"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:02:55 crc kubenswrapper[4626]: I0223 08:02:55.686533 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://e459a9ed2ad6229b1cb787ff55877ab11d2a5f9aa86208083379ca7a8a6dc7f0" gracePeriod=600 Feb 23 08:02:56 crc kubenswrapper[4626]: I0223 08:02:56.400225 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="e459a9ed2ad6229b1cb787ff55877ab11d2a5f9aa86208083379ca7a8a6dc7f0" exitCode=0 Feb 23 08:02:56 crc kubenswrapper[4626]: I0223 08:02:56.400325 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"e459a9ed2ad6229b1cb787ff55877ab11d2a5f9aa86208083379ca7a8a6dc7f0"} Feb 23 08:02:56 crc kubenswrapper[4626]: I0223 08:02:56.400714 4626 scope.go:117] "RemoveContainer" 
containerID="e577c440bfa5a9e4489ead64fef1c5024bdc27be69913213f66cd1a1981d2d4c"
Feb 23 08:02:57 crc kubenswrapper[4626]: I0223 08:02:57.417031 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b"}
Feb 23 08:04:45 crc kubenswrapper[4626]: I0223 08:04:45.443537 4626 generic.go:334] "Generic (PLEG): container finished" podID="71976b26-a20d-4173-98a9-e4d5b553fb8b" containerID="ce3770cf9d1d69675d9892d368761375d5dc5c0d146efc01f1cbe09d7d308a9b" exitCode=0
Feb 23 08:04:45 crc kubenswrapper[4626]: I0223 08:04:45.443607 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"71976b26-a20d-4173-98a9-e4d5b553fb8b","Type":"ContainerDied","Data":"ce3770cf9d1d69675d9892d368761375d5dc5c0d146efc01f1cbe09d7d308a9b"}
Feb 23 08:04:46 crc kubenswrapper[4626]: I0223 08:04:46.928083 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.036647 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"]
Feb 23 08:04:47 crc kubenswrapper[4626]: E0223 08:04:47.037400 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="extract-utilities"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.037420 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="extract-utilities"
Feb 23 08:04:47 crc kubenswrapper[4626]: E0223 08:04:47.037436 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="extract-content"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.037443 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="extract-content"
Feb 23 08:04:47 crc kubenswrapper[4626]: E0223 08:04:47.037459 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="registry-server"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.037464 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="registry-server"
Feb 23 08:04:47 crc kubenswrapper[4626]: E0223 08:04:47.037485 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71976b26-a20d-4173-98a9-e4d5b553fb8b" containerName="tempest-tests-tempest-tests-runner"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.037490 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="71976b26-a20d-4173-98a9-e4d5b553fb8b" containerName="tempest-tests-tempest-tests-runner"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.037758 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4c68dc-066d-4cc1-8312-55cbc7a01277" containerName="registry-server"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.037778 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="71976b26-a20d-4173-98a9-e4d5b553fb8b" containerName="tempest-tests-tempest-tests-runner"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.038590 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.042019 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnvws\" (UniqueName: \"kubernetes.io/projected/71976b26-a20d-4173-98a9-e4d5b553fb8b-kube-api-access-qnvws\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.042065 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.042095 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-config-data\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.042911 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-config-data" (OuterVolumeSpecName: "config-data") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.043023 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.043144 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.043366 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config-secret\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.043680 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-temporary\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.043851 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ca-certs\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.043877 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ssh-key\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.043944 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-workdir\") pod \"71976b26-a20d-4173-98a9-e4d5b553fb8b\" (UID: \"71976b26-a20d-4173-98a9-e4d5b553fb8b\") "
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.044955 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.046763 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"]
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.049334 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.050926 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71976b26-a20d-4173-98a9-e4d5b553fb8b-kube-api-access-qnvws" (OuterVolumeSpecName: "kube-api-access-qnvws") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "kube-api-access-qnvws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.052832 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.056658 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.081721 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.098197 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.105917 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.106241 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.117552 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "71976b26-a20d-4173-98a9-e4d5b553fb8b" (UID: "71976b26-a20d-4173-98a9-e4d5b553fb8b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.146936 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147003 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147134 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147245 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bdkj\" (UniqueName: \"kubernetes.io/projected/4774892a-4776-4db1-b74e-d78df11aa97e-kube-api-access-5bdkj\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147314 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147488 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147590 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147649 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147672 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147747 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnvws\" (UniqueName: \"kubernetes.io/projected/71976b26-a20d-4173-98a9-e4d5b553fb8b-kube-api-access-qnvws\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147761 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147773 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147820 4626 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147844 4626 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ca-certs\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147856 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71976b26-a20d-4173-98a9-e4d5b553fb8b-ssh-key\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.147868 4626 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/71976b26-a20d-4173-98a9-e4d5b553fb8b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.171437 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249009 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bdkj\" (UniqueName: \"kubernetes.io/projected/4774892a-4776-4db1-b74e-d78df11aa97e-kube-api-access-5bdkj\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249048 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249125 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249157 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249174 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249191 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249213 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.249274 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.250141 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.250385 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.250687 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.250888 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.252621 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.253233 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.253317 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.262474 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bdkj\" (UniqueName: \"kubernetes.io/projected/4774892a-4776-4db1-b74e-d78df11aa97e-kube-api-access-5bdkj\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.455209 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.460038 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"71976b26-a20d-4173-98a9-e4d5b553fb8b","Type":"ContainerDied","Data":"cde27dfb87b9b97aa4139bda27c86b3c974fd5f271939f111022754cb8b0ccc1"}
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.460085 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cde27dfb87b9b97aa4139bda27c86b3c974fd5f271939f111022754cb8b0ccc1"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.460093 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 23 08:04:47 crc kubenswrapper[4626]: I0223 08:04:47.800918 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"]
Feb 23 08:04:48 crc kubenswrapper[4626]: I0223 08:04:48.468857 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"4774892a-4776-4db1-b74e-d78df11aa97e","Type":"ContainerStarted","Data":"fe7054b3686175475873f70259ed650445a8d1382a2a1031820607136a312904"}
Feb 23 08:04:49 crc kubenswrapper[4626]: I0223 08:04:49.478270 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"4774892a-4776-4db1-b74e-d78df11aa97e","Type":"ContainerStarted","Data":"7ad08357352f67f88732dd4c1e7f11901676a755054b59f42eb83b2c913b3c20"}
Feb 23 08:04:49 crc kubenswrapper[4626]: I0223 08:04:49.496277 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" podStartSLOduration=2.4962622850000002 podStartE2EDuration="2.496262285s" podCreationTimestamp="2026-02-23 08:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:04:49.495939737 +0000 UTC m=+5041.835269003" watchObservedRunningTime="2026-02-23 08:04:49.496262285 +0000 UTC m=+5041.835591551"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.290955 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bfhfq"]
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.293476 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.299751 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfhfq"]
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.467226 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-utilities\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.467534 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-catalog-content\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.469082 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwp9m\" (UniqueName: \"kubernetes.io/projected/6a650894-8096-473e-b272-d64b327b6fd8-kube-api-access-fwp9m\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.571233 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-utilities\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.571397 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-catalog-content\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.571612 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwp9m\" (UniqueName: \"kubernetes.io/projected/6a650894-8096-473e-b272-d64b327b6fd8-kube-api-access-fwp9m\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.571829 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-catalog-content\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.572055 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-utilities\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.901962 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwp9m\" (UniqueName: \"kubernetes.io/projected/6a650894-8096-473e-b272-d64b327b6fd8-kube-api-access-fwp9m\") pod \"redhat-marketplace-bfhfq\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") " pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:52 crc kubenswrapper[4626]: I0223 08:04:52.920978 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:04:53 crc kubenswrapper[4626]: I0223 08:04:53.366237 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfhfq"]
Feb 23 08:04:53 crc kubenswrapper[4626]: W0223 08:04:53.382122 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a650894_8096_473e_b272_d64b327b6fd8.slice/crio-a94587417a627782e074cb017bd2f0c6821b0900fae2075564151d0571249a49 WatchSource:0}: Error finding container a94587417a627782e074cb017bd2f0c6821b0900fae2075564151d0571249a49: Status 404 returned error can't find the container with id a94587417a627782e074cb017bd2f0c6821b0900fae2075564151d0571249a49
Feb 23 08:04:53 crc kubenswrapper[4626]: I0223 08:04:53.518511 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfhfq" event={"ID":"6a650894-8096-473e-b272-d64b327b6fd8","Type":"ContainerStarted","Data":"a94587417a627782e074cb017bd2f0c6821b0900fae2075564151d0571249a49"}
Feb 23 08:04:54 crc kubenswrapper[4626]: I0223 08:04:54.531314 4626 generic.go:334] "Generic (PLEG): container finished" podID="6a650894-8096-473e-b272-d64b327b6fd8" containerID="e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1" exitCode=0
Feb 23 08:04:54 crc kubenswrapper[4626]: I0223 08:04:54.531355 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfhfq" event={"ID":"6a650894-8096-473e-b272-d64b327b6fd8","Type":"ContainerDied","Data":"e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1"}
Feb 23 08:04:55 crc kubenswrapper[4626]: I0223 08:04:55.541680 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfhfq" event={"ID":"6a650894-8096-473e-b272-d64b327b6fd8","Type":"ContainerStarted","Data":"cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701"}
Feb 23 08:04:56 crc kubenswrapper[4626]: I0223 08:04:56.553197 4626 generic.go:334] "Generic (PLEG): container finished" podID="6a650894-8096-473e-b272-d64b327b6fd8" containerID="cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701" exitCode=0
Feb 23 08:04:56 crc kubenswrapper[4626]: I0223 08:04:56.553258 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfhfq" event={"ID":"6a650894-8096-473e-b272-d64b327b6fd8","Type":"ContainerDied","Data":"cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701"}
Feb 23 08:04:57 crc kubenswrapper[4626]: I0223 08:04:57.568851 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfhfq" event={"ID":"6a650894-8096-473e-b272-d64b327b6fd8","Type":"ContainerStarted","Data":"c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7"}
Feb 23 08:04:57 crc kubenswrapper[4626]: I0223 08:04:57.590206 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bfhfq" podStartSLOduration=3.069853342 podStartE2EDuration="5.590192722s" podCreationTimestamp="2026-02-23 08:04:52 +0000 UTC" firstStartedPulling="2026-02-23 08:04:54.534570786 +0000 UTC m=+5046.873900052" lastFinishedPulling="2026-02-23 08:04:57.054910167 +0000 UTC m=+5049.394239432" observedRunningTime="2026-02-23 08:04:57.586195105 +0000 UTC m=+5049.925524371" watchObservedRunningTime="2026-02-23 08:04:57.590192722 +0000 UTC m=+5049.929521988"
Feb 23 08:05:02 crc kubenswrapper[4626]: I0223 08:05:02.922568 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:05:02 crc kubenswrapper[4626]: I0223 08:05:02.923405 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:05:02 crc kubenswrapper[4626]: I0223 08:05:02.971777 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:05:03 crc kubenswrapper[4626]: I0223 08:05:03.674719 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:05:03 crc kubenswrapper[4626]: I0223 08:05:03.749820 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfhfq"]
Feb 23 08:05:05 crc kubenswrapper[4626]: I0223 08:05:05.646956 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bfhfq" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="registry-server" containerID="cri-o://c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7" gracePeriod=2
Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.094418 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfhfq"
Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.217166 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-catalog-content\") pod \"6a650894-8096-473e-b272-d64b327b6fd8\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") "
Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.221055 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-utilities\") pod \"6a650894-8096-473e-b272-d64b327b6fd8\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") "
Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.221266 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwp9m\" (UniqueName: \"kubernetes.io/projected/6a650894-8096-473e-b272-d64b327b6fd8-kube-api-access-fwp9m\") pod \"6a650894-8096-473e-b272-d64b327b6fd8\" (UID: \"6a650894-8096-473e-b272-d64b327b6fd8\") "
Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.221590 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-utilities" (OuterVolumeSpecName: "utilities") pod "6a650894-8096-473e-b272-d64b327b6fd8" (UID: "6a650894-8096-473e-b272-d64b327b6fd8"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.225540 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.239857 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a650894-8096-473e-b272-d64b327b6fd8-kube-api-access-fwp9m" (OuterVolumeSpecName: "kube-api-access-fwp9m") pod "6a650894-8096-473e-b272-d64b327b6fd8" (UID: "6a650894-8096-473e-b272-d64b327b6fd8"). InnerVolumeSpecName "kube-api-access-fwp9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.240598 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a650894-8096-473e-b272-d64b327b6fd8" (UID: "6a650894-8096-473e-b272-d64b327b6fd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.328482 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a650894-8096-473e-b272-d64b327b6fd8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.328534 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwp9m\" (UniqueName: \"kubernetes.io/projected/6a650894-8096-473e-b272-d64b327b6fd8-kube-api-access-fwp9m\") on node \"crc\" DevicePath \"\"" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.662130 4626 generic.go:334] "Generic (PLEG): container finished" podID="6a650894-8096-473e-b272-d64b327b6fd8" containerID="c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7" exitCode=0 Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.662181 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfhfq" event={"ID":"6a650894-8096-473e-b272-d64b327b6fd8","Type":"ContainerDied","Data":"c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7"} Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.662215 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfhfq" event={"ID":"6a650894-8096-473e-b272-d64b327b6fd8","Type":"ContainerDied","Data":"a94587417a627782e074cb017bd2f0c6821b0900fae2075564151d0571249a49"} Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.662218 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfhfq" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.662233 4626 scope.go:117] "RemoveContainer" containerID="c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.697874 4626 scope.go:117] "RemoveContainer" containerID="cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.710306 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfhfq"] Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.718028 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfhfq"] Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.726326 4626 scope.go:117] "RemoveContainer" containerID="e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.761865 4626 scope.go:117] "RemoveContainer" containerID="c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7" Feb 23 08:05:06 crc kubenswrapper[4626]: E0223 08:05:06.762704 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7\": container with ID starting with c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7 not found: ID does not exist" containerID="c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.762744 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7"} err="failed to get container status \"c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7\": rpc error: code = NotFound desc = could not find container 
\"c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7\": container with ID starting with c80d8b8addbae6be7cefa7d2aa4a5e4ea5851344a902b273449aeb0b80a206b7 not found: ID does not exist" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.762775 4626 scope.go:117] "RemoveContainer" containerID="cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701" Feb 23 08:05:06 crc kubenswrapper[4626]: E0223 08:05:06.763118 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701\": container with ID starting with cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701 not found: ID does not exist" containerID="cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.763143 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701"} err="failed to get container status \"cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701\": rpc error: code = NotFound desc = could not find container \"cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701\": container with ID starting with cf92c81fcbdb6791cd3c561ab029f080a99099d2293719d18521c0d6ff96c701 not found: ID does not exist" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.763175 4626 scope.go:117] "RemoveContainer" containerID="e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1" Feb 23 08:05:06 crc kubenswrapper[4626]: E0223 08:05:06.763534 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1\": container with ID starting with e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1 not found: ID does not exist" 
containerID="e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1" Feb 23 08:05:06 crc kubenswrapper[4626]: I0223 08:05:06.763557 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1"} err="failed to get container status \"e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1\": rpc error: code = NotFound desc = could not find container \"e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1\": container with ID starting with e8d34e8013a8c123544161f54f7fd5680773371d1917494dac16b4d74e8ac9f1 not found: ID does not exist" Feb 23 08:05:07 crc kubenswrapper[4626]: I0223 08:05:07.991039 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a650894-8096-473e-b272-d64b327b6fd8" path="/var/lib/kubelet/pods/6a650894-8096-473e-b272-d64b327b6fd8/volumes" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.870858 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2m9f7"] Feb 23 08:05:11 crc kubenswrapper[4626]: E0223 08:05:11.873116 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="registry-server" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.873271 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="registry-server" Feb 23 08:05:11 crc kubenswrapper[4626]: E0223 08:05:11.873385 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="extract-utilities" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.873462 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="extract-utilities" Feb 23 08:05:11 crc kubenswrapper[4626]: E0223 08:05:11.873544 4626 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="extract-content" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.873634 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="extract-content" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.873914 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a650894-8096-473e-b272-d64b327b6fd8" containerName="registry-server" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.875574 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.897116 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2m9f7"] Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.957070 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-catalog-content\") pod \"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.957431 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ps9q\" (UniqueName: \"kubernetes.io/projected/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-kube-api-access-6ps9q\") pod \"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:11 crc kubenswrapper[4626]: I0223 08:05:11.957485 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-utilities\") pod 
\"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.059412 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-utilities\") pod \"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.059877 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-utilities\") pod \"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.060104 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-catalog-content\") pod \"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.060421 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-catalog-content\") pod \"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.060806 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ps9q\" (UniqueName: \"kubernetes.io/projected/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-kube-api-access-6ps9q\") pod \"certified-operators-2m9f7\" (UID: 
\"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.100174 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ps9q\" (UniqueName: \"kubernetes.io/projected/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-kube-api-access-6ps9q\") pod \"certified-operators-2m9f7\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.196103 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:12 crc kubenswrapper[4626]: I0223 08:05:12.700938 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2m9f7"] Feb 23 08:05:13 crc kubenswrapper[4626]: W0223 08:05:13.099396 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7046b50_3bb7_4a8c_a5cf_e1acd86661a9.slice/crio-abba131090e7c4a5e45bb70b744e46598752e6975e3979e0d0e2f626c80a0ef2 WatchSource:0}: Error finding container abba131090e7c4a5e45bb70b744e46598752e6975e3979e0d0e2f626c80a0ef2: Status 404 returned error can't find the container with id abba131090e7c4a5e45bb70b744e46598752e6975e3979e0d0e2f626c80a0ef2 Feb 23 08:05:13 crc kubenswrapper[4626]: I0223 08:05:13.738298 4626 generic.go:334] "Generic (PLEG): container finished" podID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerID="63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc" exitCode=0 Feb 23 08:05:13 crc kubenswrapper[4626]: I0223 08:05:13.738594 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2m9f7" event={"ID":"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9","Type":"ContainerDied","Data":"63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc"} Feb 23 08:05:13 
crc kubenswrapper[4626]: I0223 08:05:13.738622 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2m9f7" event={"ID":"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9","Type":"ContainerStarted","Data":"abba131090e7c4a5e45bb70b744e46598752e6975e3979e0d0e2f626c80a0ef2"} Feb 23 08:05:14 crc kubenswrapper[4626]: I0223 08:05:14.750042 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2m9f7" event={"ID":"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9","Type":"ContainerStarted","Data":"9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c"} Feb 23 08:05:16 crc kubenswrapper[4626]: I0223 08:05:16.774078 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2m9f7" event={"ID":"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9","Type":"ContainerDied","Data":"9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c"} Feb 23 08:05:16 crc kubenswrapper[4626]: I0223 08:05:16.773998 4626 generic.go:334] "Generic (PLEG): container finished" podID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerID="9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c" exitCode=0 Feb 23 08:05:17 crc kubenswrapper[4626]: I0223 08:05:17.803189 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2m9f7" event={"ID":"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9","Type":"ContainerStarted","Data":"ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5"} Feb 23 08:05:17 crc kubenswrapper[4626]: I0223 08:05:17.828622 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2m9f7" podStartSLOduration=3.244189731 podStartE2EDuration="6.828598386s" podCreationTimestamp="2026-02-23 08:05:11 +0000 UTC" firstStartedPulling="2026-02-23 08:05:13.740664261 +0000 UTC m=+5066.079993527" lastFinishedPulling="2026-02-23 08:05:17.325072916 +0000 UTC 
m=+5069.664402182" observedRunningTime="2026-02-23 08:05:17.819628423 +0000 UTC m=+5070.158957688" watchObservedRunningTime="2026-02-23 08:05:17.828598386 +0000 UTC m=+5070.167927652" Feb 23 08:05:22 crc kubenswrapper[4626]: I0223 08:05:22.196381 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:22 crc kubenswrapper[4626]: I0223 08:05:22.197468 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:22 crc kubenswrapper[4626]: I0223 08:05:22.322024 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:22 crc kubenswrapper[4626]: I0223 08:05:22.888263 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:23 crc kubenswrapper[4626]: I0223 08:05:23.491204 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2m9f7"] Feb 23 08:05:24 crc kubenswrapper[4626]: I0223 08:05:24.865012 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2m9f7" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="registry-server" containerID="cri-o://ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5" gracePeriod=2 Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.295464 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.385686 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ps9q\" (UniqueName: \"kubernetes.io/projected/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-kube-api-access-6ps9q\") pod \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.385993 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-catalog-content\") pod \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.386193 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-utilities\") pod \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\" (UID: \"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9\") " Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.387027 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-utilities" (OuterVolumeSpecName: "utilities") pod "a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" (UID: "a7046b50-3bb7-4a8c-a5cf-e1acd86661a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.399617 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-kube-api-access-6ps9q" (OuterVolumeSpecName: "kube-api-access-6ps9q") pod "a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" (UID: "a7046b50-3bb7-4a8c-a5cf-e1acd86661a9"). InnerVolumeSpecName "kube-api-access-6ps9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.426816 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" (UID: "a7046b50-3bb7-4a8c-a5cf-e1acd86661a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.489597 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.489641 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.489653 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ps9q\" (UniqueName: \"kubernetes.io/projected/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9-kube-api-access-6ps9q\") on node \"crc\" DevicePath \"\"" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.685276 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.685337 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.873262 4626 generic.go:334] "Generic (PLEG): container finished" podID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerID="ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5" exitCode=0 Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.873309 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2m9f7" event={"ID":"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9","Type":"ContainerDied","Data":"ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5"} Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.873324 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2m9f7" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.873343 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2m9f7" event={"ID":"a7046b50-3bb7-4a8c-a5cf-e1acd86661a9","Type":"ContainerDied","Data":"abba131090e7c4a5e45bb70b744e46598752e6975e3979e0d0e2f626c80a0ef2"} Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.873361 4626 scope.go:117] "RemoveContainer" containerID="ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.906930 4626 scope.go:117] "RemoveContainer" containerID="9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.909236 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2m9f7"] Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.920733 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2m9f7"] Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.936716 4626 scope.go:117] "RemoveContainer" 
containerID="63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.956592 4626 scope.go:117] "RemoveContainer" containerID="ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5" Feb 23 08:05:25 crc kubenswrapper[4626]: E0223 08:05:25.956919 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5\": container with ID starting with ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5 not found: ID does not exist" containerID="ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.956951 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5"} err="failed to get container status \"ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5\": rpc error: code = NotFound desc = could not find container \"ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5\": container with ID starting with ccbb8f881840727e4098edae9c00a3fe22216159a87329b835a29c2c20a9d5f5 not found: ID does not exist" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.956973 4626 scope.go:117] "RemoveContainer" containerID="9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c" Feb 23 08:05:25 crc kubenswrapper[4626]: E0223 08:05:25.957195 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c\": container with ID starting with 9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c not found: ID does not exist" containerID="9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c" Feb 23 08:05:25 crc 
kubenswrapper[4626]: I0223 08:05:25.957256 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c"} err="failed to get container status \"9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c\": rpc error: code = NotFound desc = could not find container \"9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c\": container with ID starting with 9d4ec13563ccc28e96f01b1a0e4ef7667655afc11af9ca518dc6fcee2064b71c not found: ID does not exist" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.957271 4626 scope.go:117] "RemoveContainer" containerID="63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc" Feb 23 08:05:25 crc kubenswrapper[4626]: E0223 08:05:25.957494 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc\": container with ID starting with 63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc not found: ID does not exist" containerID="63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.957530 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc"} err="failed to get container status \"63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc\": rpc error: code = NotFound desc = could not find container \"63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc\": container with ID starting with 63a9ac297f48f4ad2f72a90dfc32c503f808c26ede54c13552feaebcdb1004dc not found: ID does not exist" Feb 23 08:05:25 crc kubenswrapper[4626]: I0223 08:05:25.995762 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" 
path="/var/lib/kubelet/pods/a7046b50-3bb7-4a8c-a5cf-e1acd86661a9/volumes" Feb 23 08:05:41 crc kubenswrapper[4626]: I0223 08:05:41.893059 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79dfdd6449-flhwk"] Feb 23 08:05:41 crc kubenswrapper[4626]: E0223 08:05:41.893908 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="registry-server" Feb 23 08:05:41 crc kubenswrapper[4626]: I0223 08:05:41.893922 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="registry-server" Feb 23 08:05:41 crc kubenswrapper[4626]: E0223 08:05:41.893941 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="extract-utilities" Feb 23 08:05:41 crc kubenswrapper[4626]: I0223 08:05:41.893948 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="extract-utilities" Feb 23 08:05:41 crc kubenswrapper[4626]: E0223 08:05:41.893979 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="extract-content" Feb 23 08:05:41 crc kubenswrapper[4626]: I0223 08:05:41.893985 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="extract-content" Feb 23 08:05:41 crc kubenswrapper[4626]: I0223 08:05:41.894191 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7046b50-3bb7-4a8c-a5cf-e1acd86661a9" containerName="registry-server" Feb 23 08:05:41 crc kubenswrapper[4626]: I0223 08:05:41.895141 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:41 crc kubenswrapper[4626]: I0223 08:05:41.951627 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79dfdd6449-flhwk"] Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.014567 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2fn7\" (UniqueName: \"kubernetes.io/projected/058dce91-7662-43cb-bcda-d7d44e59029b-kube-api-access-d2fn7\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.014674 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-httpd-config\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.014713 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-internal-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.014855 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.014893 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-combined-ca-bundle\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.014935 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-config\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.014992 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-ovndb-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.119179 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2fn7\" (UniqueName: \"kubernetes.io/projected/058dce91-7662-43cb-bcda-d7d44e59029b-kube-api-access-d2fn7\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.119434 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-httpd-config\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.119569 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-internal-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.119782 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.119883 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-combined-ca-bundle\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.119980 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-config\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.120140 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-ovndb-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.126609 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-config\") pod \"neutron-79dfdd6449-flhwk\" (UID: 
\"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.126901 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.127194 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-combined-ca-bundle\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.127265 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-httpd-config\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.128242 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-ovndb-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.131614 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-internal-tls-certs\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 
08:05:42.140699 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2fn7\" (UniqueName: \"kubernetes.io/projected/058dce91-7662-43cb-bcda-d7d44e59029b-kube-api-access-d2fn7\") pod \"neutron-79dfdd6449-flhwk\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.211742 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:42 crc kubenswrapper[4626]: I0223 08:05:42.848390 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79dfdd6449-flhwk"] Feb 23 08:05:43 crc kubenswrapper[4626]: I0223 08:05:43.069599 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79dfdd6449-flhwk" event={"ID":"058dce91-7662-43cb-bcda-d7d44e59029b","Type":"ContainerStarted","Data":"32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce"} Feb 23 08:05:43 crc kubenswrapper[4626]: I0223 08:05:43.070004 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79dfdd6449-flhwk" event={"ID":"058dce91-7662-43cb-bcda-d7d44e59029b","Type":"ContainerStarted","Data":"be08365417306c286e226e76de12635afc2e8f8a43c258620915a2e942f6b25e"} Feb 23 08:05:44 crc kubenswrapper[4626]: I0223 08:05:44.083740 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79dfdd6449-flhwk" event={"ID":"058dce91-7662-43cb-bcda-d7d44e59029b","Type":"ContainerStarted","Data":"c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b"} Feb 23 08:05:44 crc kubenswrapper[4626]: I0223 08:05:44.084047 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:05:44 crc kubenswrapper[4626]: I0223 08:05:44.109845 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79dfdd6449-flhwk" podStartSLOduration=3.109825741 
podStartE2EDuration="3.109825741s" podCreationTimestamp="2026-02-23 08:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:05:44.1011594 +0000 UTC m=+5096.440488667" watchObservedRunningTime="2026-02-23 08:05:44.109825741 +0000 UTC m=+5096.449155027" Feb 23 08:05:55 crc kubenswrapper[4626]: I0223 08:05:55.685548 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:05:55 crc kubenswrapper[4626]: I0223 08:05:55.686135 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:06:12 crc kubenswrapper[4626]: I0223 08:06:12.362130 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:06:12 crc kubenswrapper[4626]: I0223 08:06:12.473970 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-755f5b5889-45mmc"] Feb 23 08:06:12 crc kubenswrapper[4626]: I0223 08:06:12.474218 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-755f5b5889-45mmc" podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-api" containerID="cri-o://bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c" gracePeriod=30 Feb 23 08:06:12 crc kubenswrapper[4626]: I0223 08:06:12.474293 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-755f5b5889-45mmc" 
podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-httpd" containerID="cri-o://3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25" gracePeriod=30 Feb 23 08:06:13 crc kubenswrapper[4626]: I0223 08:06:13.394853 4626 generic.go:334] "Generic (PLEG): container finished" podID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerID="3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25" exitCode=0 Feb 23 08:06:13 crc kubenswrapper[4626]: I0223 08:06:13.394927 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755f5b5889-45mmc" event={"ID":"3a761443-78ec-4b4c-8e91-fb2ff1061771","Type":"ContainerDied","Data":"3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25"} Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.236294 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-755f5b5889-45mmc" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.247406 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-ovndb-tls-certs\") pod \"3a761443-78ec-4b4c-8e91-fb2ff1061771\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.247541 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-httpd-config\") pod \"3a761443-78ec-4b4c-8e91-fb2ff1061771\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.247592 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-public-tls-certs\") pod \"3a761443-78ec-4b4c-8e91-fb2ff1061771\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " Feb 23 08:06:25 
crc kubenswrapper[4626]: I0223 08:06:25.247724 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjmpt\" (UniqueName: \"kubernetes.io/projected/3a761443-78ec-4b4c-8e91-fb2ff1061771-kube-api-access-rjmpt\") pod \"3a761443-78ec-4b4c-8e91-fb2ff1061771\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.247864 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-internal-tls-certs\") pod \"3a761443-78ec-4b4c-8e91-fb2ff1061771\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.247916 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-combined-ca-bundle\") pod \"3a761443-78ec-4b4c-8e91-fb2ff1061771\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.248225 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-config\") pod \"3a761443-78ec-4b4c-8e91-fb2ff1061771\" (UID: \"3a761443-78ec-4b4c-8e91-fb2ff1061771\") " Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.262990 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3a761443-78ec-4b4c-8e91-fb2ff1061771" (UID: "3a761443-78ec-4b4c-8e91-fb2ff1061771"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.272801 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a761443-78ec-4b4c-8e91-fb2ff1061771-kube-api-access-rjmpt" (OuterVolumeSpecName: "kube-api-access-rjmpt") pod "3a761443-78ec-4b4c-8e91-fb2ff1061771" (UID: "3a761443-78ec-4b4c-8e91-fb2ff1061771"). InnerVolumeSpecName "kube-api-access-rjmpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.310415 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a761443-78ec-4b4c-8e91-fb2ff1061771" (UID: "3a761443-78ec-4b4c-8e91-fb2ff1061771"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.316879 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3a761443-78ec-4b4c-8e91-fb2ff1061771" (UID: "3a761443-78ec-4b4c-8e91-fb2ff1061771"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.320047 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3a761443-78ec-4b4c-8e91-fb2ff1061771" (UID: "3a761443-78ec-4b4c-8e91-fb2ff1061771"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.321121 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3a761443-78ec-4b4c-8e91-fb2ff1061771" (UID: "3a761443-78ec-4b4c-8e91-fb2ff1061771"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.336473 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-config" (OuterVolumeSpecName: "config") pod "3a761443-78ec-4b4c-8e91-fb2ff1061771" (UID: "3a761443-78ec-4b4c-8e91-fb2ff1061771"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.350808 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.350835 4626 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.350847 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.350858 4626 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 
08:06:25.350867 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjmpt\" (UniqueName: \"kubernetes.io/projected/3a761443-78ec-4b4c-8e91-fb2ff1061771-kube-api-access-rjmpt\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.350876 4626 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.350884 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a761443-78ec-4b4c-8e91-fb2ff1061771-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.534836 4626 generic.go:334] "Generic (PLEG): container finished" podID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerID="bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c" exitCode=0 Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.534918 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755f5b5889-45mmc" event={"ID":"3a761443-78ec-4b4c-8e91-fb2ff1061771","Type":"ContainerDied","Data":"bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c"} Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.534954 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-755f5b5889-45mmc" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.535001 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755f5b5889-45mmc" event={"ID":"3a761443-78ec-4b4c-8e91-fb2ff1061771","Type":"ContainerDied","Data":"336a7ee4f8ad0ead41b49175f499b39c3525a3c4be66604e7f1489059d118650"} Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.535031 4626 scope.go:117] "RemoveContainer" containerID="3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.575732 4626 scope.go:117] "RemoveContainer" containerID="bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.578795 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-755f5b5889-45mmc"] Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.588073 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-755f5b5889-45mmc"] Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.606486 4626 scope.go:117] "RemoveContainer" containerID="3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25" Feb 23 08:06:25 crc kubenswrapper[4626]: E0223 08:06:25.606962 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25\": container with ID starting with 3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25 not found: ID does not exist" containerID="3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.607007 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25"} err="failed to get container status 
\"3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25\": rpc error: code = NotFound desc = could not find container \"3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25\": container with ID starting with 3b63b0c3241e25d78dd8d61338726da8b3830e9e3c51cdb60f4099e48d34ac25 not found: ID does not exist" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.607036 4626 scope.go:117] "RemoveContainer" containerID="bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c" Feb 23 08:06:25 crc kubenswrapper[4626]: E0223 08:06:25.607468 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c\": container with ID starting with bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c not found: ID does not exist" containerID="bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.607603 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c"} err="failed to get container status \"bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c\": rpc error: code = NotFound desc = could not find container \"bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c\": container with ID starting with bb785637be390c00e626bfff61788204809e22b1aedb4e3b44500460c2ddfb4c not found: ID does not exist" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.685888 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.685947 4626 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.685997 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.686979 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.687046 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" gracePeriod=600 Feb 23 08:06:25 crc kubenswrapper[4626]: E0223 08:06:25.810143 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:06:25 crc kubenswrapper[4626]: I0223 08:06:25.992698 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" path="/var/lib/kubelet/pods/3a761443-78ec-4b4c-8e91-fb2ff1061771/volumes" Feb 23 08:06:26 crc kubenswrapper[4626]: I0223 08:06:26.547905 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" exitCode=0 Feb 23 08:06:26 crc kubenswrapper[4626]: I0223 08:06:26.547956 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b"} Feb 23 08:06:26 crc kubenswrapper[4626]: I0223 08:06:26.547996 4626 scope.go:117] "RemoveContainer" containerID="e459a9ed2ad6229b1cb787ff55877ab11d2a5f9aa86208083379ca7a8a6dc7f0" Feb 23 08:06:26 crc kubenswrapper[4626]: I0223 08:06:26.548938 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:06:26 crc kubenswrapper[4626]: E0223 08:06:26.549363 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:06:38 crc kubenswrapper[4626]: I0223 08:06:38.982666 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:06:38 crc kubenswrapper[4626]: E0223 08:06:38.983472 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:06:51 crc kubenswrapper[4626]: I0223 08:06:51.982226 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:06:51 crc kubenswrapper[4626]: E0223 08:06:51.982856 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:07:02 crc kubenswrapper[4626]: I0223 08:07:02.981830 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:07:02 crc kubenswrapper[4626]: E0223 08:07:02.982632 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:07:14 crc kubenswrapper[4626]: I0223 08:07:14.982104 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:07:14 crc kubenswrapper[4626]: E0223 08:07:14.982982 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:07:29 crc kubenswrapper[4626]: I0223 08:07:29.981864 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:07:29 crc kubenswrapper[4626]: E0223 08:07:29.983179 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:07:40 crc kubenswrapper[4626]: I0223 08:07:40.982799 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:07:40 crc kubenswrapper[4626]: E0223 08:07:40.983619 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:07:54 crc kubenswrapper[4626]: I0223 08:07:54.982846 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:07:54 crc kubenswrapper[4626]: E0223 08:07:54.983418 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:08:06 crc kubenswrapper[4626]: I0223 08:08:06.982389 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:08:06 crc kubenswrapper[4626]: E0223 08:08:06.983705 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:08:19 crc kubenswrapper[4626]: I0223 08:08:19.982374 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:08:19 crc kubenswrapper[4626]: E0223 08:08:19.983492 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:08:30 crc kubenswrapper[4626]: I0223 08:08:30.983790 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:08:30 crc kubenswrapper[4626]: E0223 08:08:30.984955 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:08:45 crc kubenswrapper[4626]: I0223 08:08:45.982026 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:08:45 crc kubenswrapper[4626]: E0223 08:08:45.982855 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:09:00 crc kubenswrapper[4626]: I0223 08:09:00.982800 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:09:00 crc kubenswrapper[4626]: E0223 08:09:00.983937 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:09:15 crc kubenswrapper[4626]: I0223 08:09:15.982559 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:09:15 crc kubenswrapper[4626]: E0223 08:09:15.983387 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:09:27 crc kubenswrapper[4626]: I0223 08:09:27.988323 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:09:27 crc kubenswrapper[4626]: E0223 08:09:27.989365 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:09:41 crc kubenswrapper[4626]: I0223 08:09:41.982695 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:09:41 crc kubenswrapper[4626]: E0223 08:09:41.983787 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:09:56 crc kubenswrapper[4626]: I0223 08:09:56.982104 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:09:56 crc kubenswrapper[4626]: E0223 08:09:56.984273 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:10:08 crc kubenswrapper[4626]: I0223 08:10:08.981863 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:10:08 crc kubenswrapper[4626]: E0223 08:10:08.982680 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:10:23 crc kubenswrapper[4626]: I0223 08:10:23.982412 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:10:23 crc kubenswrapper[4626]: E0223 08:10:23.983747 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:10:35 crc kubenswrapper[4626]: I0223 08:10:35.981887 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:10:35 crc kubenswrapper[4626]: E0223 08:10:35.982781 4626 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:10:50 crc kubenswrapper[4626]: I0223 08:10:50.982046 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:10:50 crc kubenswrapper[4626]: E0223 08:10:50.982901 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:11:04 crc kubenswrapper[4626]: I0223 08:11:04.981799 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:11:04 crc kubenswrapper[4626]: E0223 08:11:04.982603 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:11:15 crc kubenswrapper[4626]: I0223 08:11:15.983256 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:11:15 crc kubenswrapper[4626]: E0223 08:11:15.984317 4626 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:11:24 crc kubenswrapper[4626]: I0223 08:11:24.852489 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rqg8g"] Feb 23 08:11:24 crc kubenswrapper[4626]: E0223 08:11:24.853555 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-api" Feb 23 08:11:24 crc kubenswrapper[4626]: I0223 08:11:24.853570 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-api" Feb 23 08:11:24 crc kubenswrapper[4626]: E0223 08:11:24.853602 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-httpd" Feb 23 08:11:24 crc kubenswrapper[4626]: I0223 08:11:24.853608 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-httpd" Feb 23 08:11:24 crc kubenswrapper[4626]: I0223 08:11:24.853845 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-api" Feb 23 08:11:24 crc kubenswrapper[4626]: I0223 08:11:24.853865 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a761443-78ec-4b4c-8e91-fb2ff1061771" containerName="neutron-httpd" Feb 23 08:11:24 crc kubenswrapper[4626]: I0223 08:11:24.855266 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:24 crc kubenswrapper[4626]: I0223 08:11:24.869214 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqg8g"] Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.006249 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hs5\" (UniqueName: \"kubernetes.io/projected/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-kube-api-access-l6hs5\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.006882 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-catalog-content\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.007261 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-utilities\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.109721 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hs5\" (UniqueName: \"kubernetes.io/projected/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-kube-api-access-l6hs5\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.109806 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-catalog-content\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.109841 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-utilities\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.110301 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-catalog-content\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.110627 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-utilities\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.132297 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hs5\" (UniqueName: \"kubernetes.io/projected/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-kube-api-access-l6hs5\") pod \"redhat-operators-rqg8g\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.175961 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:25 crc kubenswrapper[4626]: I0223 08:11:25.622339 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rqg8g"] Feb 23 08:11:26 crc kubenswrapper[4626]: I0223 08:11:26.114276 4626 generic.go:334] "Generic (PLEG): container finished" podID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerID="58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc" exitCode=0 Feb 23 08:11:26 crc kubenswrapper[4626]: I0223 08:11:26.114399 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqg8g" event={"ID":"6bc862c8-0e1a-44c9-88f5-761f1deb32a6","Type":"ContainerDied","Data":"58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc"} Feb 23 08:11:26 crc kubenswrapper[4626]: I0223 08:11:26.114675 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqg8g" event={"ID":"6bc862c8-0e1a-44c9-88f5-761f1deb32a6","Type":"ContainerStarted","Data":"f0c4a33e0e38f0c5923ce39105967073237f85fbda2658828204feecd00e1d79"} Feb 23 08:11:26 crc kubenswrapper[4626]: I0223 08:11:26.119285 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:11:27 crc kubenswrapper[4626]: I0223 08:11:27.128174 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqg8g" event={"ID":"6bc862c8-0e1a-44c9-88f5-761f1deb32a6","Type":"ContainerStarted","Data":"fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee"} Feb 23 08:11:30 crc kubenswrapper[4626]: I0223 08:11:30.153806 4626 generic.go:334] "Generic (PLEG): container finished" podID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerID="fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee" exitCode=0 Feb 23 08:11:30 crc kubenswrapper[4626]: I0223 08:11:30.153877 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rqg8g" event={"ID":"6bc862c8-0e1a-44c9-88f5-761f1deb32a6","Type":"ContainerDied","Data":"fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee"} Feb 23 08:11:30 crc kubenswrapper[4626]: I0223 08:11:30.982957 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:11:31 crc kubenswrapper[4626]: I0223 08:11:31.163829 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqg8g" event={"ID":"6bc862c8-0e1a-44c9-88f5-761f1deb32a6","Type":"ContainerStarted","Data":"d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06"} Feb 23 08:11:31 crc kubenswrapper[4626]: I0223 08:11:31.165878 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"b9c72109d4552546be7635bcac855a1a5b0f1ea902139ce188fbb97d9a9a11fe"} Feb 23 08:11:31 crc kubenswrapper[4626]: I0223 08:11:31.190473 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rqg8g" podStartSLOduration=2.6875938660000003 podStartE2EDuration="7.190456612s" podCreationTimestamp="2026-02-23 08:11:24 +0000 UTC" firstStartedPulling="2026-02-23 08:11:26.118061603 +0000 UTC m=+5438.457390869" lastFinishedPulling="2026-02-23 08:11:30.62092435 +0000 UTC m=+5442.960253615" observedRunningTime="2026-02-23 08:11:31.182849831 +0000 UTC m=+5443.522179097" watchObservedRunningTime="2026-02-23 08:11:31.190456612 +0000 UTC m=+5443.529785878" Feb 23 08:11:35 crc kubenswrapper[4626]: I0223 08:11:35.176686 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:35 crc kubenswrapper[4626]: I0223 08:11:35.177482 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:36 crc kubenswrapper[4626]: I0223 08:11:36.231266 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rqg8g" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="registry-server" probeResult="failure" output=< Feb 23 08:11:36 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 08:11:36 crc kubenswrapper[4626]: > Feb 23 08:11:45 crc kubenswrapper[4626]: I0223 08:11:45.214756 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:45 crc kubenswrapper[4626]: I0223 08:11:45.255174 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:45 crc kubenswrapper[4626]: I0223 08:11:45.450476 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqg8g"] Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.292923 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rqg8g" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="registry-server" containerID="cri-o://d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06" gracePeriod=2 Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.845212 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.986255 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-utilities\") pod \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.986415 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6hs5\" (UniqueName: \"kubernetes.io/projected/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-kube-api-access-l6hs5\") pod \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.986925 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-catalog-content\") pod \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\" (UID: \"6bc862c8-0e1a-44c9-88f5-761f1deb32a6\") " Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.987493 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-utilities" (OuterVolumeSpecName: "utilities") pod "6bc862c8-0e1a-44c9-88f5-761f1deb32a6" (UID: "6bc862c8-0e1a-44c9-88f5-761f1deb32a6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.987786 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:11:46 crc kubenswrapper[4626]: I0223 08:11:46.995658 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-kube-api-access-l6hs5" (OuterVolumeSpecName: "kube-api-access-l6hs5") pod "6bc862c8-0e1a-44c9-88f5-761f1deb32a6" (UID: "6bc862c8-0e1a-44c9-88f5-761f1deb32a6"). InnerVolumeSpecName "kube-api-access-l6hs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.081129 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bc862c8-0e1a-44c9-88f5-761f1deb32a6" (UID: "6bc862c8-0e1a-44c9-88f5-761f1deb32a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.090767 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6hs5\" (UniqueName: \"kubernetes.io/projected/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-kube-api-access-l6hs5\") on node \"crc\" DevicePath \"\"" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.090805 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc862c8-0e1a-44c9-88f5-761f1deb32a6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.307441 4626 generic.go:334] "Generic (PLEG): container finished" podID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerID="d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06" exitCode=0 Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.307478 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqg8g" event={"ID":"6bc862c8-0e1a-44c9-88f5-761f1deb32a6","Type":"ContainerDied","Data":"d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06"} Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.307547 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rqg8g" event={"ID":"6bc862c8-0e1a-44c9-88f5-761f1deb32a6","Type":"ContainerDied","Data":"f0c4a33e0e38f0c5923ce39105967073237f85fbda2658828204feecd00e1d79"} Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.307570 4626 scope.go:117] "RemoveContainer" containerID="d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.307491 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rqg8g" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.328528 4626 scope.go:117] "RemoveContainer" containerID="fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.339197 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rqg8g"] Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.346587 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rqg8g"] Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.365669 4626 scope.go:117] "RemoveContainer" containerID="58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.384325 4626 scope.go:117] "RemoveContainer" containerID="d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06" Feb 23 08:11:47 crc kubenswrapper[4626]: E0223 08:11:47.384794 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06\": container with ID starting with d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06 not found: ID does not exist" containerID="d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.384825 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06"} err="failed to get container status \"d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06\": rpc error: code = NotFound desc = could not find container \"d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06\": container with ID starting with d1ff7118a81636793b6ca8e08be50595b613e4efe7803ccc1d71b9ee90bbbb06 not found: ID does 
not exist" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.384844 4626 scope.go:117] "RemoveContainer" containerID="fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee" Feb 23 08:11:47 crc kubenswrapper[4626]: E0223 08:11:47.385125 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee\": container with ID starting with fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee not found: ID does not exist" containerID="fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.385149 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee"} err="failed to get container status \"fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee\": rpc error: code = NotFound desc = could not find container \"fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee\": container with ID starting with fc63f3fb0f7f0e2970e84bbab890d764ee0ae25b6b2b3b836559ed845b3b6aee not found: ID does not exist" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.385166 4626 scope.go:117] "RemoveContainer" containerID="58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc" Feb 23 08:11:47 crc kubenswrapper[4626]: E0223 08:11:47.385764 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc\": container with ID starting with 58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc not found: ID does not exist" containerID="58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.385786 4626 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc"} err="failed to get container status \"58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc\": rpc error: code = NotFound desc = could not find container \"58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc\": container with ID starting with 58f01a835e963776847a85202afedf9afd86d9c7effc214e824132d52805babc not found: ID does not exist" Feb 23 08:11:47 crc kubenswrapper[4626]: I0223 08:11:47.994184 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" path="/var/lib/kubelet/pods/6bc862c8-0e1a-44c9-88f5-761f1deb32a6/volumes" Feb 23 08:13:47 crc kubenswrapper[4626]: I0223 08:13:47.620366 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6465458495-hgsdz" podUID="434e199c-4e18-4274-bbaa-f81f2e2a697b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 23 08:13:55 crc kubenswrapper[4626]: I0223 08:13:55.685264 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:13:55 crc kubenswrapper[4626]: I0223 08:13:55.685637 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:14:25 crc kubenswrapper[4626]: I0223 08:14:25.685372 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:14:25 crc kubenswrapper[4626]: I0223 08:14:25.686027 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:14:47 crc kubenswrapper[4626]: I0223 08:14:47.618121 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6465458495-hgsdz" podUID="434e199c-4e18-4274-bbaa-f81f2e2a697b" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.684883 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.685650 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.685721 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.687243 4626 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9c72109d4552546be7635bcac855a1a5b0f1ea902139ce188fbb97d9a9a11fe"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.687337 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://b9c72109d4552546be7635bcac855a1a5b0f1ea902139ce188fbb97d9a9a11fe" gracePeriod=600 Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.904136 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="b9c72109d4552546be7635bcac855a1a5b0f1ea902139ce188fbb97d9a9a11fe" exitCode=0 Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.904383 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"b9c72109d4552546be7635bcac855a1a5b0f1ea902139ce188fbb97d9a9a11fe"} Feb 23 08:14:55 crc kubenswrapper[4626]: I0223 08:14:55.904438 4626 scope.go:117] "RemoveContainer" containerID="d36a61aae14c12ca080cae233885b15b58f2526fb63c08ed77fea453ddfe117b" Feb 23 08:14:56 crc kubenswrapper[4626]: I0223 08:14:56.913831 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"} Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.173174 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj"] 
Feb 23 08:15:00 crc kubenswrapper[4626]: E0223 08:15:00.174342 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.174361 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[4626]: E0223 08:15:00.174406 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.174412 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[4626]: E0223 08:15:00.174423 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="extract-content" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.174429 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="extract-content" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.174664 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc862c8-0e1a-44c9-88f5-761f1deb32a6" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.175311 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.187947 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj"] Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.188760 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.188769 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.359799 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c354d1-c9bb-4808-b7ea-a63b849e5c77-secret-volume\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.360197 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6nm5\" (UniqueName: \"kubernetes.io/projected/55c354d1-c9bb-4808-b7ea-a63b849e5c77-kube-api-access-n6nm5\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.360530 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c354d1-c9bb-4808-b7ea-a63b849e5c77-config-volume\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.462615 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c354d1-c9bb-4808-b7ea-a63b849e5c77-config-volume\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.462689 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c354d1-c9bb-4808-b7ea-a63b849e5c77-secret-volume\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.462786 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6nm5\" (UniqueName: \"kubernetes.io/projected/55c354d1-c9bb-4808-b7ea-a63b849e5c77-kube-api-access-n6nm5\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.463430 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c354d1-c9bb-4808-b7ea-a63b849e5c77-config-volume\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.474488 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/55c354d1-c9bb-4808-b7ea-a63b849e5c77-secret-volume\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.478662 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6nm5\" (UniqueName: \"kubernetes.io/projected/55c354d1-c9bb-4808-b7ea-a63b849e5c77-kube-api-access-n6nm5\") pod \"collect-profiles-29530575-8hxfj\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.509332 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:00 crc kubenswrapper[4626]: I0223 08:15:00.953631 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj"] Feb 23 08:15:00 crc kubenswrapper[4626]: W0223 08:15:00.967399 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c354d1_c9bb_4808_b7ea_a63b849e5c77.slice/crio-d4baeb7b46e1d89718b654a8ec675c72112e84de02047de8b3f1c0e864c83036 WatchSource:0}: Error finding container d4baeb7b46e1d89718b654a8ec675c72112e84de02047de8b3f1c0e864c83036: Status 404 returned error can't find the container with id d4baeb7b46e1d89718b654a8ec675c72112e84de02047de8b3f1c0e864c83036 Feb 23 08:15:01 crc kubenswrapper[4626]: I0223 08:15:01.959037 4626 generic.go:334] "Generic (PLEG): container finished" podID="55c354d1-c9bb-4808-b7ea-a63b849e5c77" containerID="6554d67d66e6cf9055b893ce40e0039ffcd768076076c6a6f50e31868a739073" exitCode=0 Feb 23 08:15:01 crc kubenswrapper[4626]: I0223 08:15:01.959152 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" event={"ID":"55c354d1-c9bb-4808-b7ea-a63b849e5c77","Type":"ContainerDied","Data":"6554d67d66e6cf9055b893ce40e0039ffcd768076076c6a6f50e31868a739073"} Feb 23 08:15:01 crc kubenswrapper[4626]: I0223 08:15:01.960700 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" event={"ID":"55c354d1-c9bb-4808-b7ea-a63b849e5c77","Type":"ContainerStarted","Data":"d4baeb7b46e1d89718b654a8ec675c72112e84de02047de8b3f1c0e864c83036"} Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.240701 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.432572 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c354d1-c9bb-4808-b7ea-a63b849e5c77-config-volume\") pod \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.432651 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6nm5\" (UniqueName: \"kubernetes.io/projected/55c354d1-c9bb-4808-b7ea-a63b849e5c77-kube-api-access-n6nm5\") pod \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.432688 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c354d1-c9bb-4808-b7ea-a63b849e5c77-secret-volume\") pod \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\" (UID: \"55c354d1-c9bb-4808-b7ea-a63b849e5c77\") " Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.433427 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/55c354d1-c9bb-4808-b7ea-a63b849e5c77-config-volume" (OuterVolumeSpecName: "config-volume") pod "55c354d1-c9bb-4808-b7ea-a63b849e5c77" (UID: "55c354d1-c9bb-4808-b7ea-a63b849e5c77"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.438871 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c354d1-c9bb-4808-b7ea-a63b849e5c77-kube-api-access-n6nm5" (OuterVolumeSpecName: "kube-api-access-n6nm5") pod "55c354d1-c9bb-4808-b7ea-a63b849e5c77" (UID: "55c354d1-c9bb-4808-b7ea-a63b849e5c77"). InnerVolumeSpecName "kube-api-access-n6nm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.439005 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c354d1-c9bb-4808-b7ea-a63b849e5c77-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "55c354d1-c9bb-4808-b7ea-a63b849e5c77" (UID: "55c354d1-c9bb-4808-b7ea-a63b849e5c77"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.535256 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c354d1-c9bb-4808-b7ea-a63b849e5c77-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.535292 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6nm5\" (UniqueName: \"kubernetes.io/projected/55c354d1-c9bb-4808-b7ea-a63b849e5c77-kube-api-access-n6nm5\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.535304 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c354d1-c9bb-4808-b7ea-a63b849e5c77-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.978038 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" event={"ID":"55c354d1-c9bb-4808-b7ea-a63b849e5c77","Type":"ContainerDied","Data":"d4baeb7b46e1d89718b654a8ec675c72112e84de02047de8b3f1c0e864c83036"} Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.978105 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj" Feb 23 08:15:03 crc kubenswrapper[4626]: I0223 08:15:03.978086 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4baeb7b46e1d89718b654a8ec675c72112e84de02047de8b3f1c0e864c83036" Feb 23 08:15:04 crc kubenswrapper[4626]: I0223 08:15:04.315709 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"] Feb 23 08:15:04 crc kubenswrapper[4626]: I0223 08:15:04.323202 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8f76d"] Feb 23 08:15:05 crc kubenswrapper[4626]: I0223 08:15:05.991468 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8553bf-f699-41d1-b085-de32675318d6" path="/var/lib/kubelet/pods/ce8553bf-f699-41d1-b085-de32675318d6/volumes" Feb 23 08:15:23 crc kubenswrapper[4626]: I0223 08:15:23.405235 4626 scope.go:117] "RemoveContainer" containerID="2f70721d91afcc09a7dcdba6a750563c8a5d2edeee33e28570d0815a8959fec6" Feb 23 08:16:03 crc kubenswrapper[4626]: E0223 08:16:03.334210 4626 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.58:45048->192.168.26.58:46805: write tcp 192.168.26.58:45048->192.168.26.58:46805: write: broken pipe Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.058178 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7nwb2"] Feb 23 08:16:28 crc kubenswrapper[4626]: E0223 08:16:28.058877 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c354d1-c9bb-4808-b7ea-a63b849e5c77" containerName="collect-profiles" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.058893 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c354d1-c9bb-4808-b7ea-a63b849e5c77" containerName="collect-profiles" Feb 23 08:16:28 crc 
kubenswrapper[4626]: I0223 08:16:28.059102 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c354d1-c9bb-4808-b7ea-a63b849e5c77" containerName="collect-profiles" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.062536 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.072347 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7nwb2"] Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.111203 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-utilities\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.111241 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-catalog-content\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.111470 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv5hx\" (UniqueName: \"kubernetes.io/projected/9698c50c-444e-417f-b1f5-ff74c10b5a5d-kube-api-access-jv5hx\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.213075 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv5hx\" (UniqueName: 
\"kubernetes.io/projected/9698c50c-444e-417f-b1f5-ff74c10b5a5d-kube-api-access-jv5hx\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.213197 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-utilities\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.213214 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-catalog-content\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.213719 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-catalog-content\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.213925 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-utilities\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.231930 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv5hx\" (UniqueName: 
\"kubernetes.io/projected/9698c50c-444e-417f-b1f5-ff74c10b5a5d-kube-api-access-jv5hx\") pod \"certified-operators-7nwb2\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.377816 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:28 crc kubenswrapper[4626]: I0223 08:16:28.840654 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7nwb2"] Feb 23 08:16:29 crc kubenswrapper[4626]: I0223 08:16:29.675752 4626 generic.go:334] "Generic (PLEG): container finished" podID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerID="95010924aa1c7d82b0d4b96966bb871bb368ca5e42e5e63c4938b5bb14052585" exitCode=0 Feb 23 08:16:29 crc kubenswrapper[4626]: I0223 08:16:29.675935 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwb2" event={"ID":"9698c50c-444e-417f-b1f5-ff74c10b5a5d","Type":"ContainerDied","Data":"95010924aa1c7d82b0d4b96966bb871bb368ca5e42e5e63c4938b5bb14052585"} Feb 23 08:16:29 crc kubenswrapper[4626]: I0223 08:16:29.675964 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwb2" event={"ID":"9698c50c-444e-417f-b1f5-ff74c10b5a5d","Type":"ContainerStarted","Data":"5ec8277ef0bafaed8baa45217a470f62ffe4e81311bb51e2d63c3e35226cb1c1"} Feb 23 08:16:29 crc kubenswrapper[4626]: I0223 08:16:29.677675 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:16:30 crc kubenswrapper[4626]: I0223 08:16:30.684361 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwb2" event={"ID":"9698c50c-444e-417f-b1f5-ff74c10b5a5d","Type":"ContainerStarted","Data":"8cca6d49fd64cbf5945125a4fc07c92940231e56c2597238376de92bae67566d"} Feb 23 08:16:31 
crc kubenswrapper[4626]: I0223 08:16:31.696966 4626 generic.go:334] "Generic (PLEG): container finished" podID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerID="8cca6d49fd64cbf5945125a4fc07c92940231e56c2597238376de92bae67566d" exitCode=0 Feb 23 08:16:31 crc kubenswrapper[4626]: I0223 08:16:31.697613 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwb2" event={"ID":"9698c50c-444e-417f-b1f5-ff74c10b5a5d","Type":"ContainerDied","Data":"8cca6d49fd64cbf5945125a4fc07c92940231e56c2597238376de92bae67566d"} Feb 23 08:16:32 crc kubenswrapper[4626]: I0223 08:16:32.726569 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwb2" event={"ID":"9698c50c-444e-417f-b1f5-ff74c10b5a5d","Type":"ContainerStarted","Data":"85f858f02e14c7005cd0ab2885b2c27c9ea71a7edd30bd41f3e549269af4f4bb"} Feb 23 08:16:32 crc kubenswrapper[4626]: I0223 08:16:32.750085 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7nwb2" podStartSLOduration=2.261204518 podStartE2EDuration="4.750059986s" podCreationTimestamp="2026-02-23 08:16:28 +0000 UTC" firstStartedPulling="2026-02-23 08:16:29.677441012 +0000 UTC m=+5742.016770278" lastFinishedPulling="2026-02-23 08:16:32.16629648 +0000 UTC m=+5744.505625746" observedRunningTime="2026-02-23 08:16:32.744958958 +0000 UTC m=+5745.084288223" watchObservedRunningTime="2026-02-23 08:16:32.750059986 +0000 UTC m=+5745.089389252" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.052933 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xdtzt"] Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.059882 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.067164 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdtzt"] Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.070487 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-catalog-content\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.070590 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-utilities\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.070827 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mtfn\" (UniqueName: \"kubernetes.io/projected/39929d12-fba9-4814-b489-b95352112bca-kube-api-access-4mtfn\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.173546 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-catalog-content\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.173897 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-utilities\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.174035 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-catalog-content\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.174529 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mtfn\" (UniqueName: \"kubernetes.io/projected/39929d12-fba9-4814-b489-b95352112bca-kube-api-access-4mtfn\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.174663 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-utilities\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.194959 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mtfn\" (UniqueName: \"kubernetes.io/projected/39929d12-fba9-4814-b489-b95352112bca-kube-api-access-4mtfn\") pod \"community-operators-xdtzt\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:36 crc kubenswrapper[4626]: I0223 08:16:36.381781 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.050726 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9g72c"] Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.054347 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.065842 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g72c"] Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.132365 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdtzt"] Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.202525 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-utilities\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.202574 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdk8f\" (UniqueName: \"kubernetes.io/projected/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-kube-api-access-qdk8f\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.202700 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-catalog-content\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " 
pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.305193 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-catalog-content\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.305556 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-utilities\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.305586 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdk8f\" (UniqueName: \"kubernetes.io/projected/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-kube-api-access-qdk8f\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.306192 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-catalog-content\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.306743 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-utilities\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" 
Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.323268 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdk8f\" (UniqueName: \"kubernetes.io/projected/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-kube-api-access-qdk8f\") pod \"redhat-marketplace-9g72c\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.378230 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.767908 4626 generic.go:334] "Generic (PLEG): container finished" podID="39929d12-fba9-4814-b489-b95352112bca" containerID="ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0" exitCode=0 Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.767952 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdtzt" event={"ID":"39929d12-fba9-4814-b489-b95352112bca","Type":"ContainerDied","Data":"ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0"} Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.767982 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdtzt" event={"ID":"39929d12-fba9-4814-b489-b95352112bca","Type":"ContainerStarted","Data":"512c7796e2d18dd5c80c35e5987ee8266c505e8515cfe225a332f89475d302ae"} Feb 23 08:16:37 crc kubenswrapper[4626]: I0223 08:16:37.873029 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g72c"] Feb 23 08:16:38 crc kubenswrapper[4626]: I0223 08:16:38.378089 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:38 crc kubenswrapper[4626]: I0223 08:16:38.378683 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:38 crc kubenswrapper[4626]: I0223 08:16:38.426720 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:38 crc kubenswrapper[4626]: I0223 08:16:38.786068 4626 generic.go:334] "Generic (PLEG): container finished" podID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerID="9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282" exitCode=0 Feb 23 08:16:38 crc kubenswrapper[4626]: I0223 08:16:38.788019 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g72c" event={"ID":"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21","Type":"ContainerDied","Data":"9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282"} Feb 23 08:16:38 crc kubenswrapper[4626]: I0223 08:16:38.788079 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g72c" event={"ID":"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21","Type":"ContainerStarted","Data":"9680de1f2cb1df9b1f4b09847acba9350f9fe2c11e4c92622b5f9765f4838a4f"} Feb 23 08:16:38 crc kubenswrapper[4626]: I0223 08:16:38.849445 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:39 crc kubenswrapper[4626]: I0223 08:16:39.818845 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g72c" event={"ID":"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21","Type":"ContainerStarted","Data":"dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701"} Feb 23 08:16:39 crc kubenswrapper[4626]: I0223 08:16:39.821792 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdtzt" event={"ID":"39929d12-fba9-4814-b489-b95352112bca","Type":"ContainerStarted","Data":"b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856"} Feb 23 08:16:40 
crc kubenswrapper[4626]: I0223 08:16:40.830074 4626 generic.go:334] "Generic (PLEG): container finished" podID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerID="dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701" exitCode=0 Feb 23 08:16:40 crc kubenswrapper[4626]: I0223 08:16:40.830136 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g72c" event={"ID":"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21","Type":"ContainerDied","Data":"dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701"} Feb 23 08:16:40 crc kubenswrapper[4626]: I0223 08:16:40.834252 4626 generic.go:334] "Generic (PLEG): container finished" podID="39929d12-fba9-4814-b489-b95352112bca" containerID="b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856" exitCode=0 Feb 23 08:16:40 crc kubenswrapper[4626]: I0223 08:16:40.834290 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdtzt" event={"ID":"39929d12-fba9-4814-b489-b95352112bca","Type":"ContainerDied","Data":"b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856"} Feb 23 08:16:41 crc kubenswrapper[4626]: I0223 08:16:41.449304 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7nwb2"] Feb 23 08:16:41 crc kubenswrapper[4626]: I0223 08:16:41.449951 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7nwb2" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="registry-server" containerID="cri-o://85f858f02e14c7005cd0ab2885b2c27c9ea71a7edd30bd41f3e549269af4f4bb" gracePeriod=2 Feb 23 08:16:41 crc kubenswrapper[4626]: I0223 08:16:41.878662 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g72c" 
event={"ID":"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21","Type":"ContainerStarted","Data":"d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74"} Feb 23 08:16:41 crc kubenswrapper[4626]: I0223 08:16:41.927814 4626 generic.go:334] "Generic (PLEG): container finished" podID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerID="85f858f02e14c7005cd0ab2885b2c27c9ea71a7edd30bd41f3e549269af4f4bb" exitCode=0 Feb 23 08:16:41 crc kubenswrapper[4626]: I0223 08:16:41.927999 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwb2" event={"ID":"9698c50c-444e-417f-b1f5-ff74c10b5a5d","Type":"ContainerDied","Data":"85f858f02e14c7005cd0ab2885b2c27c9ea71a7edd30bd41f3e549269af4f4bb"} Feb 23 08:16:41 crc kubenswrapper[4626]: I0223 08:16:41.968413 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdtzt" event={"ID":"39929d12-fba9-4814-b489-b95352112bca","Type":"ContainerStarted","Data":"d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417"} Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.032631 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xdtzt" podStartSLOduration=2.492120125 podStartE2EDuration="6.032606807s" podCreationTimestamp="2026-02-23 08:16:36 +0000 UTC" firstStartedPulling="2026-02-23 08:16:37.769373422 +0000 UTC m=+5750.108702688" lastFinishedPulling="2026-02-23 08:16:41.309860103 +0000 UTC m=+5753.649189370" observedRunningTime="2026-02-23 08:16:42.02718822 +0000 UTC m=+5754.366517486" watchObservedRunningTime="2026-02-23 08:16:42.032606807 +0000 UTC m=+5754.371936072" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.033957 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9g72c" podStartSLOduration=2.486139342 podStartE2EDuration="5.033946442s" podCreationTimestamp="2026-02-23 08:16:37 +0000 UTC" 
firstStartedPulling="2026-02-23 08:16:38.790936763 +0000 UTC m=+5751.130266029" lastFinishedPulling="2026-02-23 08:16:41.338743862 +0000 UTC m=+5753.678073129" observedRunningTime="2026-02-23 08:16:41.915193358 +0000 UTC m=+5754.254522624" watchObservedRunningTime="2026-02-23 08:16:42.033946442 +0000 UTC m=+5754.373275708" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.214866 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.347945 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-catalog-content\") pod \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.348079 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv5hx\" (UniqueName: \"kubernetes.io/projected/9698c50c-444e-417f-b1f5-ff74c10b5a5d-kube-api-access-jv5hx\") pod \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.348158 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-utilities\") pod \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\" (UID: \"9698c50c-444e-417f-b1f5-ff74c10b5a5d\") " Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.348616 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-utilities" (OuterVolumeSpecName: "utilities") pod "9698c50c-444e-417f-b1f5-ff74c10b5a5d" (UID: "9698c50c-444e-417f-b1f5-ff74c10b5a5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.348836 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.357653 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9698c50c-444e-417f-b1f5-ff74c10b5a5d-kube-api-access-jv5hx" (OuterVolumeSpecName: "kube-api-access-jv5hx") pod "9698c50c-444e-417f-b1f5-ff74c10b5a5d" (UID: "9698c50c-444e-417f-b1f5-ff74c10b5a5d"). InnerVolumeSpecName "kube-api-access-jv5hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.381863 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9698c50c-444e-417f-b1f5-ff74c10b5a5d" (UID: "9698c50c-444e-417f-b1f5-ff74c10b5a5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.451719 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv5hx\" (UniqueName: \"kubernetes.io/projected/9698c50c-444e-417f-b1f5-ff74c10b5a5d-kube-api-access-jv5hx\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.451754 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9698c50c-444e-417f-b1f5-ff74c10b5a5d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.987389 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nwb2" event={"ID":"9698c50c-444e-417f-b1f5-ff74c10b5a5d","Type":"ContainerDied","Data":"5ec8277ef0bafaed8baa45217a470f62ffe4e81311bb51e2d63c3e35226cb1c1"} Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.987419 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7nwb2" Feb 23 08:16:42 crc kubenswrapper[4626]: I0223 08:16:42.987450 4626 scope.go:117] "RemoveContainer" containerID="85f858f02e14c7005cd0ab2885b2c27c9ea71a7edd30bd41f3e549269af4f4bb" Feb 23 08:16:43 crc kubenswrapper[4626]: I0223 08:16:43.012453 4626 scope.go:117] "RemoveContainer" containerID="8cca6d49fd64cbf5945125a4fc07c92940231e56c2597238376de92bae67566d" Feb 23 08:16:43 crc kubenswrapper[4626]: I0223 08:16:43.020017 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7nwb2"] Feb 23 08:16:43 crc kubenswrapper[4626]: I0223 08:16:43.029226 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7nwb2"] Feb 23 08:16:43 crc kubenswrapper[4626]: I0223 08:16:43.035980 4626 scope.go:117] "RemoveContainer" containerID="95010924aa1c7d82b0d4b96966bb871bb368ca5e42e5e63c4938b5bb14052585" Feb 23 08:16:43 crc kubenswrapper[4626]: I0223 08:16:43.991621 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" path="/var/lib/kubelet/pods/9698c50c-444e-417f-b1f5-ff74c10b5a5d/volumes" Feb 23 08:16:46 crc kubenswrapper[4626]: I0223 08:16:46.382001 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:46 crc kubenswrapper[4626]: I0223 08:16:46.382359 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:46 crc kubenswrapper[4626]: I0223 08:16:46.421291 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:47 crc kubenswrapper[4626]: I0223 08:16:47.071565 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:47 crc 
kubenswrapper[4626]: I0223 08:16:47.379617 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:47 crc kubenswrapper[4626]: I0223 08:16:47.379671 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:47 crc kubenswrapper[4626]: I0223 08:16:47.417888 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:48 crc kubenswrapper[4626]: I0223 08:16:48.076063 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.244744 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdtzt"] Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.245234 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xdtzt" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="registry-server" containerID="cri-o://d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417" gracePeriod=2 Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.667665 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.824363 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mtfn\" (UniqueName: \"kubernetes.io/projected/39929d12-fba9-4814-b489-b95352112bca-kube-api-access-4mtfn\") pod \"39929d12-fba9-4814-b489-b95352112bca\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.824584 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-utilities\") pod \"39929d12-fba9-4814-b489-b95352112bca\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.824643 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-catalog-content\") pod \"39929d12-fba9-4814-b489-b95352112bca\" (UID: \"39929d12-fba9-4814-b489-b95352112bca\") " Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.825167 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-utilities" (OuterVolumeSpecName: "utilities") pod "39929d12-fba9-4814-b489-b95352112bca" (UID: "39929d12-fba9-4814-b489-b95352112bca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.825270 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.830641 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39929d12-fba9-4814-b489-b95352112bca-kube-api-access-4mtfn" (OuterVolumeSpecName: "kube-api-access-4mtfn") pod "39929d12-fba9-4814-b489-b95352112bca" (UID: "39929d12-fba9-4814-b489-b95352112bca"). InnerVolumeSpecName "kube-api-access-4mtfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.845970 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g72c"] Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.876799 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39929d12-fba9-4814-b489-b95352112bca" (UID: "39929d12-fba9-4814-b489-b95352112bca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.926045 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39929d12-fba9-4814-b489-b95352112bca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:49 crc kubenswrapper[4626]: I0223 08:16:49.926081 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mtfn\" (UniqueName: \"kubernetes.io/projected/39929d12-fba9-4814-b489-b95352112bca-kube-api-access-4mtfn\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.067607 4626 generic.go:334] "Generic (PLEG): container finished" podID="39929d12-fba9-4814-b489-b95352112bca" containerID="d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417" exitCode=0 Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.067904 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdtzt" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.067814 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdtzt" event={"ID":"39929d12-fba9-4814-b489-b95352112bca","Type":"ContainerDied","Data":"d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417"} Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.067962 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdtzt" event={"ID":"39929d12-fba9-4814-b489-b95352112bca","Type":"ContainerDied","Data":"512c7796e2d18dd5c80c35e5987ee8266c505e8515cfe225a332f89475d302ae"} Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.067990 4626 scope.go:117] "RemoveContainer" containerID="d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.068760 4626 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-9g72c" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="registry-server" containerID="cri-o://d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74" gracePeriod=2 Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.094584 4626 scope.go:117] "RemoveContainer" containerID="b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.108336 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdtzt"] Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.117246 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xdtzt"] Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.122324 4626 scope.go:117] "RemoveContainer" containerID="ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.214553 4626 scope.go:117] "RemoveContainer" containerID="d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417" Feb 23 08:16:50 crc kubenswrapper[4626]: E0223 08:16:50.215760 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417\": container with ID starting with d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417 not found: ID does not exist" containerID="d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.215819 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417"} err="failed to get container status \"d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417\": rpc error: code = NotFound desc = could not find 
container \"d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417\": container with ID starting with d36c31b62edea6e92de8168929d2d97ce6eba13cd5a11546ddd8b9abd8834417 not found: ID does not exist" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.215859 4626 scope.go:117] "RemoveContainer" containerID="b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856" Feb 23 08:16:50 crc kubenswrapper[4626]: E0223 08:16:50.216351 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856\": container with ID starting with b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856 not found: ID does not exist" containerID="b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.216382 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856"} err="failed to get container status \"b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856\": rpc error: code = NotFound desc = could not find container \"b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856\": container with ID starting with b31285af8373d9799700d084a2c1df0595b7aa8ba60651b7270380ff72df0856 not found: ID does not exist" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.216403 4626 scope.go:117] "RemoveContainer" containerID="ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0" Feb 23 08:16:50 crc kubenswrapper[4626]: E0223 08:16:50.216744 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0\": container with ID starting with ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0 not found: ID does 
not exist" containerID="ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.216862 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0"} err="failed to get container status \"ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0\": rpc error: code = NotFound desc = could not find container \"ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0\": container with ID starting with ebf7164d0925eb9acdc3f9e08c8c2076d85642236c3a37f6ad9eef8eced720e0 not found: ID does not exist" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.468167 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.643011 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdk8f\" (UniqueName: \"kubernetes.io/projected/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-kube-api-access-qdk8f\") pod \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.643151 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-utilities\") pod \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.643218 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-catalog-content\") pod \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\" (UID: \"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21\") " Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 
08:16:50.643691 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-utilities" (OuterVolumeSpecName: "utilities") pod "8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" (UID: "8e009afb-3d39-4f5c-b7fd-89d27f8e8e21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.644025 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.648349 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-kube-api-access-qdk8f" (OuterVolumeSpecName: "kube-api-access-qdk8f") pod "8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" (UID: "8e009afb-3d39-4f5c-b7fd-89d27f8e8e21"). InnerVolumeSpecName "kube-api-access-qdk8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.665725 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" (UID: "8e009afb-3d39-4f5c-b7fd-89d27f8e8e21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.745691 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:50 crc kubenswrapper[4626]: I0223 08:16:50.745725 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdk8f\" (UniqueName: \"kubernetes.io/projected/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21-kube-api-access-qdk8f\") on node \"crc\" DevicePath \"\"" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.081977 4626 generic.go:334] "Generic (PLEG): container finished" podID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerID="d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74" exitCode=0 Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.082066 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9g72c" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.082058 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g72c" event={"ID":"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21","Type":"ContainerDied","Data":"d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74"} Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.082410 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9g72c" event={"ID":"8e009afb-3d39-4f5c-b7fd-89d27f8e8e21","Type":"ContainerDied","Data":"9680de1f2cb1df9b1f4b09847acba9350f9fe2c11e4c92622b5f9765f4838a4f"} Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.082439 4626 scope.go:117] "RemoveContainer" containerID="d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.099491 4626 scope.go:117] "RemoveContainer" 
containerID="dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.120573 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g72c"] Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.128774 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9g72c"] Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.138627 4626 scope.go:117] "RemoveContainer" containerID="9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.165592 4626 scope.go:117] "RemoveContainer" containerID="d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74" Feb 23 08:16:51 crc kubenswrapper[4626]: E0223 08:16:51.172012 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74\": container with ID starting with d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74 not found: ID does not exist" containerID="d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.172063 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74"} err="failed to get container status \"d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74\": rpc error: code = NotFound desc = could not find container \"d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74\": container with ID starting with d29132a5cd5c54a4233bd9a57e03c62893f619d76459402d7e76837f43584c74 not found: ID does not exist" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.172102 4626 scope.go:117] "RemoveContainer" 
containerID="dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701" Feb 23 08:16:51 crc kubenswrapper[4626]: E0223 08:16:51.172965 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701\": container with ID starting with dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701 not found: ID does not exist" containerID="dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.173074 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701"} err="failed to get container status \"dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701\": rpc error: code = NotFound desc = could not find container \"dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701\": container with ID starting with dbc80d118ac87c8efa3030565a8031b44df468a161d609e3e56fc4652e5fd701 not found: ID does not exist" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.173177 4626 scope.go:117] "RemoveContainer" containerID="9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282" Feb 23 08:16:51 crc kubenswrapper[4626]: E0223 08:16:51.173789 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282\": container with ID starting with 9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282 not found: ID does not exist" containerID="9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282" Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.173828 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282"} err="failed to get container status \"9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282\": rpc error: code = NotFound desc = could not find container \"9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282\": container with ID starting with 9d1099f77d3552da490ee4eac61b6263fa4a53ba9a1425116fa15ab98d7ef282 not found: ID does not exist"
Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.992076 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39929d12-fba9-4814-b489-b95352112bca" path="/var/lib/kubelet/pods/39929d12-fba9-4814-b489-b95352112bca/volumes"
Feb 23 08:16:51 crc kubenswrapper[4626]: I0223 08:16:51.993028 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" path="/var/lib/kubelet/pods/8e009afb-3d39-4f5c-b7fd-89d27f8e8e21/volumes"
Feb 23 08:16:55 crc kubenswrapper[4626]: I0223 08:16:55.685534 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:16:55 crc kubenswrapper[4626]: I0223 08:16:55.686113 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:17:25 crc kubenswrapper[4626]: I0223 08:17:25.685091 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:17:25 crc kubenswrapper[4626]: I0223 08:17:25.685729 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:17:55 crc kubenswrapper[4626]: I0223 08:17:55.685680 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:17:55 crc kubenswrapper[4626]: I0223 08:17:55.686125 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:17:55 crc kubenswrapper[4626]: I0223 08:17:55.686176 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 08:17:55 crc kubenswrapper[4626]: I0223 08:17:55.686797 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:17:55 crc kubenswrapper[4626]: I0223 08:17:55.686840 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22" gracePeriod=600
Feb 23 08:17:55 crc kubenswrapper[4626]: E0223 08:17:55.810068 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:17:56 crc kubenswrapper[4626]: I0223 08:17:56.620076 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22" exitCode=0
Feb 23 08:17:56 crc kubenswrapper[4626]: I0223 08:17:56.620125 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"}
Feb 23 08:17:56 crc kubenswrapper[4626]: I0223 08:17:56.620175 4626 scope.go:117] "RemoveContainer" containerID="b9c72109d4552546be7635bcac855a1a5b0f1ea902139ce188fbb97d9a9a11fe"
Feb 23 08:17:56 crc kubenswrapper[4626]: I0223 08:17:56.620751 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:17:56 crc kubenswrapper[4626]: E0223 08:17:56.621112 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:18:08 crc kubenswrapper[4626]: I0223 08:18:08.982628 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:18:08 crc kubenswrapper[4626]: E0223 08:18:08.983513 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:18:19 crc kubenswrapper[4626]: I0223 08:18:19.984137 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:18:19 crc kubenswrapper[4626]: E0223 08:18:19.985186 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:18:34 crc kubenswrapper[4626]: I0223 08:18:34.983247 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:18:34 crc kubenswrapper[4626]: E0223 08:18:34.984192 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:18:47 crc kubenswrapper[4626]: I0223 08:18:47.987793 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:18:47 crc kubenswrapper[4626]: E0223 08:18:47.988591 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:18:59 crc kubenswrapper[4626]: I0223 08:18:59.982328 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:18:59 crc kubenswrapper[4626]: E0223 08:18:59.984346 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:19:10 crc kubenswrapper[4626]: I0223 08:19:10.983009 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:19:10 crc kubenswrapper[4626]: E0223 08:19:10.984064 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:19:22 crc kubenswrapper[4626]: I0223 08:19:22.983177 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:19:22 crc kubenswrapper[4626]: E0223 08:19:22.984428 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:19:36 crc kubenswrapper[4626]: I0223 08:19:36.983095 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:19:36 crc kubenswrapper[4626]: E0223 08:19:36.986218 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:19:51 crc kubenswrapper[4626]: I0223 08:19:51.982893 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:19:51 crc kubenswrapper[4626]: E0223 08:19:51.984187 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:20:06 crc kubenswrapper[4626]: I0223 08:20:06.983004 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:20:06 crc kubenswrapper[4626]: E0223 08:20:06.983889 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:20:19 crc kubenswrapper[4626]: I0223 08:20:19.982642 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:20:19 crc kubenswrapper[4626]: E0223 08:20:19.983890 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:20:34 crc kubenswrapper[4626]: I0223 08:20:34.982877 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:20:34 crc kubenswrapper[4626]: E0223 08:20:34.983836 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:20:46 crc kubenswrapper[4626]: I0223 08:20:46.983713 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:20:46 crc kubenswrapper[4626]: E0223 08:20:46.985012 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:20:58 crc kubenswrapper[4626]: I0223 08:20:58.982089 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:20:58 crc kubenswrapper[4626]: E0223 08:20:58.982956 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:21:09 crc kubenswrapper[4626]: I0223 08:21:09.981977 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:21:09 crc kubenswrapper[4626]: E0223 08:21:09.983006 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:21:24 crc kubenswrapper[4626]: I0223 08:21:24.982675 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:21:24 crc kubenswrapper[4626]: E0223 08:21:24.983526 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:21:39 crc kubenswrapper[4626]: I0223 08:21:39.982199 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:21:39 crc kubenswrapper[4626]: E0223 08:21:39.983025 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:21:51 crc kubenswrapper[4626]: I0223 08:21:51.982151 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:21:51 crc kubenswrapper[4626]: E0223 08:21:51.983278 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:22:06 crc kubenswrapper[4626]: I0223 08:22:06.982484 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:22:06 crc kubenswrapper[4626]: E0223 08:22:06.983533 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:22:19 crc kubenswrapper[4626]: I0223 08:22:19.982696 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:22:19 crc kubenswrapper[4626]: E0223 08:22:19.984038 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:22:31 crc kubenswrapper[4626]: I0223 08:22:31.982732 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:22:31 crc kubenswrapper[4626]: E0223 08:22:31.984617 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:22:43 crc kubenswrapper[4626]: I0223 08:22:43.982626 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:22:43 crc kubenswrapper[4626]: E0223 08:22:43.983761 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:22:55 crc kubenswrapper[4626]: I0223 08:22:55.981934 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:22:56 crc kubenswrapper[4626]: I0223 08:22:56.497142 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"2204b7b7c21450520bdcbd6982ef9e7b23334ad7a2e5630fb96afb1cebf8aef7"}
Feb 23 08:23:37 crc kubenswrapper[4626]: E0223 08:23:37.569909 4626 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.58:33122->192.168.26.58:46805: write tcp 192.168.26.58:33122->192.168.26.58:46805: write: broken pipe
Feb 23 08:25:25 crc kubenswrapper[4626]: I0223 08:25:25.686041 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:25:25 crc kubenswrapper[4626]: I0223 08:25:25.686668 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:25:55 crc kubenswrapper[4626]: I0223 08:25:55.685373 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:25:55 crc kubenswrapper[4626]: I0223 08:25:55.686047 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:26:25 crc kubenswrapper[4626]: I0223 08:26:25.685157 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:26:25 crc kubenswrapper[4626]: I0223 08:26:25.685910 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:26:25 crc kubenswrapper[4626]: I0223 08:26:25.685980 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 08:26:25 crc kubenswrapper[4626]: I0223 08:26:25.686952 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2204b7b7c21450520bdcbd6982ef9e7b23334ad7a2e5630fb96afb1cebf8aef7"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:26:25 crc kubenswrapper[4626]: I0223 08:26:25.687022 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://2204b7b7c21450520bdcbd6982ef9e7b23334ad7a2e5630fb96afb1cebf8aef7" gracePeriod=600
Feb 23 08:26:26 crc kubenswrapper[4626]: I0223 08:26:26.623189 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="2204b7b7c21450520bdcbd6982ef9e7b23334ad7a2e5630fb96afb1cebf8aef7" exitCode=0
Feb 23 08:26:26 crc kubenswrapper[4626]: I0223 08:26:26.623257 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"2204b7b7c21450520bdcbd6982ef9e7b23334ad7a2e5630fb96afb1cebf8aef7"}
Feb 23 08:26:26 crc kubenswrapper[4626]: I0223 08:26:26.623687 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"}
Feb 23 08:26:26 crc kubenswrapper[4626]: I0223 08:26:26.623712 4626 scope.go:117] "RemoveContainer" containerID="95acaae366d5db70660840c2f3054e8c30909a3ded010c58bde41948c6aaca22"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.997728 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sbnbn"]
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999180 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999430 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999465 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="extract-utilities"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999474 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="extract-utilities"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999490 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="extract-content"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999548 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="extract-content"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999565 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999571 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999580 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999586 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999591 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="extract-utilities"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999598 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="extract-utilities"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999609 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="extract-utilities"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999615 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="extract-utilities"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999623 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="extract-content"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999629 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="extract-content"
Feb 23 08:26:45 crc kubenswrapper[4626]: E0223 08:26:45.999641 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="extract-content"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999648 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="extract-content"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999859 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="9698c50c-444e-417f-b1f5-ff74c10b5a5d" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999875 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="39929d12-fba9-4814-b489-b95352112bca" containerName="registry-server"
Feb 23 08:26:45 crc kubenswrapper[4626]: I0223 08:26:45.999885 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e009afb-3d39-4f5c-b7fd-89d27f8e8e21" containerName="registry-server"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.001269 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.014110 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sbnbn"]
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.095526 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-utilities\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.096077 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrz2l\" (UniqueName: \"kubernetes.io/projected/45d2b4cb-723b-49b5-a685-51f27c0ff07b-kube-api-access-jrz2l\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.096329 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-catalog-content\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.199611 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-utilities\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.199963 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrz2l\" (UniqueName: \"kubernetes.io/projected/45d2b4cb-723b-49b5-a685-51f27c0ff07b-kube-api-access-jrz2l\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.200052 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-catalog-content\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.200203 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-utilities\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.200561 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-catalog-content\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.226016 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrz2l\" (UniqueName: \"kubernetes.io/projected/45d2b4cb-723b-49b5-a685-51f27c0ff07b-kube-api-access-jrz2l\") pod \"certified-operators-sbnbn\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.316730 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:46 crc kubenswrapper[4626]: I0223 08:26:46.919095 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sbnbn"]
Feb 23 08:26:47 crc kubenswrapper[4626]: I0223 08:26:47.848629 4626 generic.go:334] "Generic (PLEG): container finished" podID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerID="f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363" exitCode=0
Feb 23 08:26:47 crc kubenswrapper[4626]: I0223 08:26:47.848716 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbnbn" event={"ID":"45d2b4cb-723b-49b5-a685-51f27c0ff07b","Type":"ContainerDied","Data":"f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363"}
Feb 23 08:26:47 crc kubenswrapper[4626]: I0223 08:26:47.849205 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbnbn" event={"ID":"45d2b4cb-723b-49b5-a685-51f27c0ff07b","Type":"ContainerStarted","Data":"1172b0665d4325313bdb25d31f9667adfddd10524f32e40882293ab342ec2add"}
Feb 23 08:26:47 crc kubenswrapper[4626]: I0223 08:26:47.854200 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:26:48 crc kubenswrapper[4626]: I0223 08:26:48.862549 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbnbn" event={"ID":"45d2b4cb-723b-49b5-a685-51f27c0ff07b","Type":"ContainerStarted","Data":"d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa"}
Feb 23 08:26:49 crc kubenswrapper[4626]: E0223 08:26:49.762830 4626 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.58:57086->192.168.26.58:46805: write tcp 192.168.26.58:57086->192.168.26.58:46805: write: connection reset by peer
Feb 23 08:26:49 crc kubenswrapper[4626]: I0223 08:26:49.884024 4626 generic.go:334] "Generic (PLEG): container finished" podID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerID="d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa" exitCode=0
Feb 23 08:26:49 crc kubenswrapper[4626]: I0223 08:26:49.884085 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbnbn" event={"ID":"45d2b4cb-723b-49b5-a685-51f27c0ff07b","Type":"ContainerDied","Data":"d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa"}
Feb 23 08:26:50 crc kubenswrapper[4626]: I0223 08:26:50.901885 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbnbn" event={"ID":"45d2b4cb-723b-49b5-a685-51f27c0ff07b","Type":"ContainerStarted","Data":"1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240"}
Feb 23 08:26:50 crc kubenswrapper[4626]: I0223 08:26:50.934186 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sbnbn" podStartSLOduration=3.406219153 podStartE2EDuration="5.934165435s" podCreationTimestamp="2026-02-23 08:26:45 +0000 UTC" firstStartedPulling="2026-02-23 08:26:47.851827076 +0000 UTC m=+6360.191156342" lastFinishedPulling="2026-02-23 08:26:50.379773359 +0000 UTC m=+6362.719102624" observedRunningTime="2026-02-23 08:26:50.921776927 +0000 UTC m=+6363.261106192" watchObservedRunningTime="2026-02-23 08:26:50.934165435 +0000 UTC m=+6363.273494701"
Feb 23 08:26:56 crc kubenswrapper[4626]: I0223 08:26:56.316952 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:56 crc kubenswrapper[4626]: I0223 08:26:56.317779 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:26:57 crc kubenswrapper[4626]: I0223 08:26:57.361803 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sbnbn" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:26:57 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:26:57 crc kubenswrapper[4626]: >
Feb 23 08:27:06 crc kubenswrapper[4626]: I0223 08:27:06.451050 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:27:06 crc kubenswrapper[4626]: I0223 08:27:06.585166 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sbnbn"
Feb 23 08:27:06 crc kubenswrapper[4626]: I0223 08:27:06.712252 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sbnbn"]
Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.074173 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sbnbn" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="registry-server"
containerID="cri-o://1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240" gracePeriod=2 Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.661432 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sbnbn" Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.776843 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrz2l\" (UniqueName: \"kubernetes.io/projected/45d2b4cb-723b-49b5-a685-51f27c0ff07b-kube-api-access-jrz2l\") pod \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.777025 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-catalog-content\") pod \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.777185 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-utilities\") pod \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\" (UID: \"45d2b4cb-723b-49b5-a685-51f27c0ff07b\") " Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.782893 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-utilities" (OuterVolumeSpecName: "utilities") pod "45d2b4cb-723b-49b5-a685-51f27c0ff07b" (UID: "45d2b4cb-723b-49b5-a685-51f27c0ff07b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.786771 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d2b4cb-723b-49b5-a685-51f27c0ff07b-kube-api-access-jrz2l" (OuterVolumeSpecName: "kube-api-access-jrz2l") pod "45d2b4cb-723b-49b5-a685-51f27c0ff07b" (UID: "45d2b4cb-723b-49b5-a685-51f27c0ff07b"). InnerVolumeSpecName "kube-api-access-jrz2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.832886 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45d2b4cb-723b-49b5-a685-51f27c0ff07b" (UID: "45d2b4cb-723b-49b5-a685-51f27c0ff07b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.882742 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrz2l\" (UniqueName: \"kubernetes.io/projected/45d2b4cb-723b-49b5-a685-51f27c0ff07b-kube-api-access-jrz2l\") on node \"crc\" DevicePath \"\"" Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.882969 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:27:08 crc kubenswrapper[4626]: I0223 08:27:08.883041 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d2b4cb-723b-49b5-a685-51f27c0ff07b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.094544 4626 generic.go:334] "Generic (PLEG): container finished" podID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" 
containerID="1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240" exitCode=0 Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.094618 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbnbn" event={"ID":"45d2b4cb-723b-49b5-a685-51f27c0ff07b","Type":"ContainerDied","Data":"1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240"} Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.094665 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbnbn" event={"ID":"45d2b4cb-723b-49b5-a685-51f27c0ff07b","Type":"ContainerDied","Data":"1172b0665d4325313bdb25d31f9667adfddd10524f32e40882293ab342ec2add"} Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.094693 4626 scope.go:117] "RemoveContainer" containerID="1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.094886 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sbnbn" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.127354 4626 scope.go:117] "RemoveContainer" containerID="d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.138572 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sbnbn"] Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.146864 4626 scope.go:117] "RemoveContainer" containerID="f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.147592 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sbnbn"] Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.189293 4626 scope.go:117] "RemoveContainer" containerID="1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240" Feb 23 08:27:09 crc kubenswrapper[4626]: E0223 08:27:09.191339 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240\": container with ID starting with 1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240 not found: ID does not exist" containerID="1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.191999 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240"} err="failed to get container status \"1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240\": rpc error: code = NotFound desc = could not find container \"1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240\": container with ID starting with 1bcbdd88a13ae45ba138d34c073b62ba4244a9e3d82a165b3b805f5d53c1d240 not 
found: ID does not exist" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.192048 4626 scope.go:117] "RemoveContainer" containerID="d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa" Feb 23 08:27:09 crc kubenswrapper[4626]: E0223 08:27:09.192505 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa\": container with ID starting with d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa not found: ID does not exist" containerID="d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.192532 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa"} err="failed to get container status \"d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa\": rpc error: code = NotFound desc = could not find container \"d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa\": container with ID starting with d39d6fe52ff3dd044343bb467083cc26698072e422968e657eb5717fbbfb9ffa not found: ID does not exist" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.192547 4626 scope.go:117] "RemoveContainer" containerID="f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363" Feb 23 08:27:09 crc kubenswrapper[4626]: E0223 08:27:09.192922 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363\": container with ID starting with f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363 not found: ID does not exist" containerID="f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.193016 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363"} err="failed to get container status \"f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363\": rpc error: code = NotFound desc = could not find container \"f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363\": container with ID starting with f076d19cb40de4ac0f92f2627d923527b5d0b14d056b4aee30f71f56e03e7363 not found: ID does not exist" Feb 23 08:27:09 crc kubenswrapper[4626]: I0223 08:27:09.996931 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" path="/var/lib/kubelet/pods/45d2b4cb-723b-49b5-a685-51f27c0ff07b/volumes" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.746934 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b85dp"] Feb 23 08:27:15 crc kubenswrapper[4626]: E0223 08:27:15.747952 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="extract-utilities" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.747969 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="extract-utilities" Feb 23 08:27:15 crc kubenswrapper[4626]: E0223 08:27:15.747982 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="registry-server" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.747989 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="registry-server" Feb 23 08:27:15 crc kubenswrapper[4626]: E0223 08:27:15.748022 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="extract-content" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 
08:27:15.748029 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="extract-content" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.748222 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d2b4cb-723b-49b5-a685-51f27c0ff07b" containerName="registry-server" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.749594 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.761642 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b85dp"] Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.841568 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6hz\" (UniqueName: \"kubernetes.io/projected/26f82e7a-f4d4-4162-bee5-02f8cd756285-kube-api-access-sb6hz\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.841776 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-utilities\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.841841 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-catalog-content\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc 
kubenswrapper[4626]: I0223 08:27:15.943911 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6hz\" (UniqueName: \"kubernetes.io/projected/26f82e7a-f4d4-4162-bee5-02f8cd756285-kube-api-access-sb6hz\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.944106 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-utilities\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.944587 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-utilities\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.944672 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-catalog-content\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.944894 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-catalog-content\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:15 crc kubenswrapper[4626]: I0223 08:27:15.970168 
4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6hz\" (UniqueName: \"kubernetes.io/projected/26f82e7a-f4d4-4162-bee5-02f8cd756285-kube-api-access-sb6hz\") pod \"community-operators-b85dp\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:16 crc kubenswrapper[4626]: I0223 08:27:16.068189 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:16 crc kubenswrapper[4626]: I0223 08:27:16.584723 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b85dp"] Feb 23 08:27:17 crc kubenswrapper[4626]: I0223 08:27:17.175825 4626 generic.go:334] "Generic (PLEG): container finished" podID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerID="a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b" exitCode=0 Feb 23 08:27:17 crc kubenswrapper[4626]: I0223 08:27:17.175921 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b85dp" event={"ID":"26f82e7a-f4d4-4162-bee5-02f8cd756285","Type":"ContainerDied","Data":"a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b"} Feb 23 08:27:17 crc kubenswrapper[4626]: I0223 08:27:17.176625 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b85dp" event={"ID":"26f82e7a-f4d4-4162-bee5-02f8cd756285","Type":"ContainerStarted","Data":"9cb7e3f20f20ec4dde691a33dd486f759eb1ed419df6e921570c0e11ac446b11"} Feb 23 08:27:18 crc kubenswrapper[4626]: I0223 08:27:18.188717 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b85dp" event={"ID":"26f82e7a-f4d4-4162-bee5-02f8cd756285","Type":"ContainerStarted","Data":"0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878"} Feb 23 08:27:19 crc kubenswrapper[4626]: I0223 08:27:19.200872 4626 
generic.go:334] "Generic (PLEG): container finished" podID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerID="0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878" exitCode=0 Feb 23 08:27:19 crc kubenswrapper[4626]: I0223 08:27:19.201064 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b85dp" event={"ID":"26f82e7a-f4d4-4162-bee5-02f8cd756285","Type":"ContainerDied","Data":"0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878"} Feb 23 08:27:20 crc kubenswrapper[4626]: I0223 08:27:20.222212 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b85dp" event={"ID":"26f82e7a-f4d4-4162-bee5-02f8cd756285","Type":"ContainerStarted","Data":"84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d"} Feb 23 08:27:20 crc kubenswrapper[4626]: I0223 08:27:20.248538 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b85dp" podStartSLOduration=2.737071317 podStartE2EDuration="5.248519674s" podCreationTimestamp="2026-02-23 08:27:15 +0000 UTC" firstStartedPulling="2026-02-23 08:27:17.180077706 +0000 UTC m=+6389.519406972" lastFinishedPulling="2026-02-23 08:27:19.691526063 +0000 UTC m=+6392.030855329" observedRunningTime="2026-02-23 08:27:20.244006214 +0000 UTC m=+6392.583335481" watchObservedRunningTime="2026-02-23 08:27:20.248519674 +0000 UTC m=+6392.587848941" Feb 23 08:27:26 crc kubenswrapper[4626]: I0223 08:27:26.069445 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:26 crc kubenswrapper[4626]: I0223 08:27:26.070074 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:26 crc kubenswrapper[4626]: I0223 08:27:26.119125 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:26 crc kubenswrapper[4626]: I0223 08:27:26.361777 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:26 crc kubenswrapper[4626]: I0223 08:27:26.432256 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b85dp"] Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.327557 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b85dp" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="registry-server" containerID="cri-o://84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d" gracePeriod=2 Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.880650 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.889540 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-catalog-content\") pod \"26f82e7a-f4d4-4162-bee5-02f8cd756285\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.889689 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-utilities\") pod \"26f82e7a-f4d4-4162-bee5-02f8cd756285\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.889899 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6hz\" (UniqueName: \"kubernetes.io/projected/26f82e7a-f4d4-4162-bee5-02f8cd756285-kube-api-access-sb6hz\") pod 
\"26f82e7a-f4d4-4162-bee5-02f8cd756285\" (UID: \"26f82e7a-f4d4-4162-bee5-02f8cd756285\") " Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.890173 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-utilities" (OuterVolumeSpecName: "utilities") pod "26f82e7a-f4d4-4162-bee5-02f8cd756285" (UID: "26f82e7a-f4d4-4162-bee5-02f8cd756285"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.890796 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.897967 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f82e7a-f4d4-4162-bee5-02f8cd756285-kube-api-access-sb6hz" (OuterVolumeSpecName: "kube-api-access-sb6hz") pod "26f82e7a-f4d4-4162-bee5-02f8cd756285" (UID: "26f82e7a-f4d4-4162-bee5-02f8cd756285"). InnerVolumeSpecName "kube-api-access-sb6hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.938240 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26f82e7a-f4d4-4162-bee5-02f8cd756285" (UID: "26f82e7a-f4d4-4162-bee5-02f8cd756285"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.992589 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6hz\" (UniqueName: \"kubernetes.io/projected/26f82e7a-f4d4-4162-bee5-02f8cd756285-kube-api-access-sb6hz\") on node \"crc\" DevicePath \"\"" Feb 23 08:27:28 crc kubenswrapper[4626]: I0223 08:27:28.992617 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f82e7a-f4d4-4162-bee5-02f8cd756285-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.336594 4626 generic.go:334] "Generic (PLEG): container finished" podID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerID="84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d" exitCode=0 Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.336679 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b85dp" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.336700 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b85dp" event={"ID":"26f82e7a-f4d4-4162-bee5-02f8cd756285","Type":"ContainerDied","Data":"84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d"} Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.337010 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b85dp" event={"ID":"26f82e7a-f4d4-4162-bee5-02f8cd756285","Type":"ContainerDied","Data":"9cb7e3f20f20ec4dde691a33dd486f759eb1ed419df6e921570c0e11ac446b11"} Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.337032 4626 scope.go:117] "RemoveContainer" containerID="84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.360678 4626 scope.go:117] "RemoveContainer" 
containerID="0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.368528 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b85dp"] Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.374087 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b85dp"] Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.388978 4626 scope.go:117] "RemoveContainer" containerID="a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.436742 4626 scope.go:117] "RemoveContainer" containerID="84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d" Feb 23 08:27:29 crc kubenswrapper[4626]: E0223 08:27:29.437453 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d\": container with ID starting with 84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d not found: ID does not exist" containerID="84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.437571 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d"} err="failed to get container status \"84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d\": rpc error: code = NotFound desc = could not find container \"84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d\": container with ID starting with 84609ec4a298f24dbc062ddd2167cfde40d5e09eeaac76de6e4c4db7d6491d0d not found: ID does not exist" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.437630 4626 scope.go:117] "RemoveContainer" 
containerID="0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878" Feb 23 08:27:29 crc kubenswrapper[4626]: E0223 08:27:29.438026 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878\": container with ID starting with 0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878 not found: ID does not exist" containerID="0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.438068 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878"} err="failed to get container status \"0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878\": rpc error: code = NotFound desc = could not find container \"0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878\": container with ID starting with 0445a94e6951eefb164c6848de7f648f9ebacc0220732d397a59555322d4a878 not found: ID does not exist" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.438098 4626 scope.go:117] "RemoveContainer" containerID="a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b" Feb 23 08:27:29 crc kubenswrapper[4626]: E0223 08:27:29.438353 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b\": container with ID starting with a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b not found: ID does not exist" containerID="a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.438399 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b"} err="failed to get container status \"a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b\": rpc error: code = NotFound desc = could not find container \"a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b\": container with ID starting with a5a2efe621b47e4954beea049310f987591369c68e35381684bbceb950afd47b not found: ID does not exist" Feb 23 08:27:29 crc kubenswrapper[4626]: I0223 08:27:29.992178 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" path="/var/lib/kubelet/pods/26f82e7a-f4d4-4162-bee5-02f8cd756285/volumes" Feb 23 08:28:25 crc kubenswrapper[4626]: I0223 08:28:25.685085 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:28:25 crc kubenswrapper[4626]: I0223 08:28:25.685783 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:28:55 crc kubenswrapper[4626]: I0223 08:28:55.685065 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:28:55 crc kubenswrapper[4626]: I0223 08:28:55.686463 4626 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:29:25 crc kubenswrapper[4626]: I0223 08:29:25.685577 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:29:25 crc kubenswrapper[4626]: I0223 08:29:25.686169 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:29:25 crc kubenswrapper[4626]: I0223 08:29:25.686226 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 08:29:25 crc kubenswrapper[4626]: I0223 08:29:25.687255 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:29:25 crc kubenswrapper[4626]: I0223 08:29:25.687312 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" 
containerID="cri-o://3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" gracePeriod=600 Feb 23 08:29:25 crc kubenswrapper[4626]: E0223 08:29:25.813762 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:29:26 crc kubenswrapper[4626]: I0223 08:29:26.458173 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" exitCode=0 Feb 23 08:29:26 crc kubenswrapper[4626]: I0223 08:29:26.458339 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"} Feb 23 08:29:26 crc kubenswrapper[4626]: I0223 08:29:26.458673 4626 scope.go:117] "RemoveContainer" containerID="2204b7b7c21450520bdcbd6982ef9e7b23334ad7a2e5630fb96afb1cebf8aef7" Feb 23 08:29:26 crc kubenswrapper[4626]: I0223 08:29:26.459331 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:29:26 crc kubenswrapper[4626]: E0223 08:29:26.459699 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" 
podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:29:39 crc kubenswrapper[4626]: I0223 08:29:39.983889 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:29:39 crc kubenswrapper[4626]: E0223 08:29:39.985336 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:29:50 crc kubenswrapper[4626]: I0223 08:29:50.982686 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:29:50 crc kubenswrapper[4626]: E0223 08:29:50.983758 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.160030 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv"] Feb 23 08:30:00 crc kubenswrapper[4626]: E0223 08:30:00.161147 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="extract-utilities" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.161169 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="extract-utilities" Feb 23 08:30:00 
crc kubenswrapper[4626]: E0223 08:30:00.161208 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="registry-server" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.161216 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="registry-server" Feb 23 08:30:00 crc kubenswrapper[4626]: E0223 08:30:00.161230 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="extract-content" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.161237 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="extract-content" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.162849 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f82e7a-f4d4-4162-bee5-02f8cd756285" containerName="registry-server" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.163732 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.170208 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv"] Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.175191 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.178207 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.184211 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr487\" (UniqueName: \"kubernetes.io/projected/865947e0-9c17-4817-89a3-2257f5683244-kube-api-access-cr487\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.184326 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/865947e0-9c17-4817-89a3-2257f5683244-config-volume\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.184363 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/865947e0-9c17-4817-89a3-2257f5683244-secret-volume\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.287025 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr487\" (UniqueName: \"kubernetes.io/projected/865947e0-9c17-4817-89a3-2257f5683244-kube-api-access-cr487\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.287258 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/865947e0-9c17-4817-89a3-2257f5683244-config-volume\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.287365 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/865947e0-9c17-4817-89a3-2257f5683244-secret-volume\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.288178 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/865947e0-9c17-4817-89a3-2257f5683244-config-volume\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.295127 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/865947e0-9c17-4817-89a3-2257f5683244-secret-volume\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.305253 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr487\" (UniqueName: \"kubernetes.io/projected/865947e0-9c17-4817-89a3-2257f5683244-kube-api-access-cr487\") pod \"collect-profiles-29530590-qk2xv\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.494556 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:00 crc kubenswrapper[4626]: I0223 08:30:00.993055 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv"] Feb 23 08:30:01 crc kubenswrapper[4626]: W0223 08:30:01.018085 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865947e0_9c17_4817_89a3_2257f5683244.slice/crio-1b3ea7a8aadb0517896285d5a76076b34eab5c04a5a9c50d47b1f06a73be6566 WatchSource:0}: Error finding container 1b3ea7a8aadb0517896285d5a76076b34eab5c04a5a9c50d47b1f06a73be6566: Status 404 returned error can't find the container with id 1b3ea7a8aadb0517896285d5a76076b34eab5c04a5a9c50d47b1f06a73be6566 Feb 23 08:30:01 crc kubenswrapper[4626]: I0223 08:30:01.788487 4626 generic.go:334] "Generic (PLEG): container finished" podID="865947e0-9c17-4817-89a3-2257f5683244" containerID="7dd92e3ddba97b0b3f2a161ca44b7039dc7848d8d778311ef09013e9b2781a58" exitCode=0 Feb 23 08:30:01 crc kubenswrapper[4626]: I0223 08:30:01.788721 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" event={"ID":"865947e0-9c17-4817-89a3-2257f5683244","Type":"ContainerDied","Data":"7dd92e3ddba97b0b3f2a161ca44b7039dc7848d8d778311ef09013e9b2781a58"} Feb 23 08:30:01 crc kubenswrapper[4626]: I0223 08:30:01.788785 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" event={"ID":"865947e0-9c17-4817-89a3-2257f5683244","Type":"ContainerStarted","Data":"1b3ea7a8aadb0517896285d5a76076b34eab5c04a5a9c50d47b1f06a73be6566"} Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.106535 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.147780 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/865947e0-9c17-4817-89a3-2257f5683244-secret-volume\") pod \"865947e0-9c17-4817-89a3-2257f5683244\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.148001 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr487\" (UniqueName: \"kubernetes.io/projected/865947e0-9c17-4817-89a3-2257f5683244-kube-api-access-cr487\") pod \"865947e0-9c17-4817-89a3-2257f5683244\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.148113 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/865947e0-9c17-4817-89a3-2257f5683244-config-volume\") pod \"865947e0-9c17-4817-89a3-2257f5683244\" (UID: \"865947e0-9c17-4817-89a3-2257f5683244\") " Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.148656 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/865947e0-9c17-4817-89a3-2257f5683244-config-volume" (OuterVolumeSpecName: "config-volume") pod "865947e0-9c17-4817-89a3-2257f5683244" (UID: "865947e0-9c17-4817-89a3-2257f5683244"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.148811 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/865947e0-9c17-4817-89a3-2257f5683244-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.166182 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865947e0-9c17-4817-89a3-2257f5683244-kube-api-access-cr487" (OuterVolumeSpecName: "kube-api-access-cr487") pod "865947e0-9c17-4817-89a3-2257f5683244" (UID: "865947e0-9c17-4817-89a3-2257f5683244"). InnerVolumeSpecName "kube-api-access-cr487". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.169586 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865947e0-9c17-4817-89a3-2257f5683244-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "865947e0-9c17-4817-89a3-2257f5683244" (UID: "865947e0-9c17-4817-89a3-2257f5683244"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.250816 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/865947e0-9c17-4817-89a3-2257f5683244-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.250846 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr487\" (UniqueName: \"kubernetes.io/projected/865947e0-9c17-4817-89a3-2257f5683244-kube-api-access-cr487\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.805789 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" event={"ID":"865947e0-9c17-4817-89a3-2257f5683244","Type":"ContainerDied","Data":"1b3ea7a8aadb0517896285d5a76076b34eab5c04a5a9c50d47b1f06a73be6566"} Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.806066 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b3ea7a8aadb0517896285d5a76076b34eab5c04a5a9c50d47b1f06a73be6566" Feb 23 08:30:03 crc kubenswrapper[4626]: I0223 08:30:03.806079 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv" Feb 23 08:30:04 crc kubenswrapper[4626]: I0223 08:30:04.194182 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m"] Feb 23 08:30:04 crc kubenswrapper[4626]: I0223 08:30:04.202238 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-mp84m"] Feb 23 08:30:05 crc kubenswrapper[4626]: I0223 08:30:05.982158 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:30:05 crc kubenswrapper[4626]: E0223 08:30:05.982959 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:30:05 crc kubenswrapper[4626]: I0223 08:30:05.993270 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a56899-cb3c-4da0-bcb1-af450262d173" path="/var/lib/kubelet/pods/b5a56899-cb3c-4da0-bcb1-af450262d173/volumes" Feb 23 08:30:20 crc kubenswrapper[4626]: I0223 08:30:20.981882 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:30:20 crc kubenswrapper[4626]: E0223 08:30:20.982741 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:30:23 crc kubenswrapper[4626]: I0223 08:30:23.757340 4626 scope.go:117] "RemoveContainer" containerID="020947f1a4c59d9e51ead17b7194047dd16164f91dc7931793256c93002e38ba" Feb 23 08:30:32 crc kubenswrapper[4626]: I0223 08:30:32.982585 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:30:32 crc kubenswrapper[4626]: E0223 08:30:32.983467 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:30:47 crc kubenswrapper[4626]: I0223 08:30:47.987322 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:30:47 crc kubenswrapper[4626]: E0223 08:30:47.988051 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:30:59 crc kubenswrapper[4626]: I0223 08:30:59.982850 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:30:59 crc kubenswrapper[4626]: E0223 08:30:59.983482 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:31:10 crc kubenswrapper[4626]: I0223 08:31:10.982091 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:31:10 crc kubenswrapper[4626]: E0223 08:31:10.983030 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:31:22 crc kubenswrapper[4626]: I0223 08:31:22.985824 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:31:22 crc kubenswrapper[4626]: E0223 08:31:22.986669 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.014879 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xw54h"] Feb 23 08:31:24 crc kubenswrapper[4626]: E0223 08:31:24.015692 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865947e0-9c17-4817-89a3-2257f5683244" 
containerName="collect-profiles" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.015708 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="865947e0-9c17-4817-89a3-2257f5683244" containerName="collect-profiles" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.015980 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="865947e0-9c17-4817-89a3-2257f5683244" containerName="collect-profiles" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.019209 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.038848 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xw54h"] Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.134447 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-catalog-content\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.134570 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-utilities\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.134641 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jtwv\" (UniqueName: \"kubernetes.io/projected/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-kube-api-access-4jtwv\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " 
pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.237052 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jtwv\" (UniqueName: \"kubernetes.io/projected/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-kube-api-access-4jtwv\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.237759 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-catalog-content\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.237975 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-utilities\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.238237 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-catalog-content\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.238450 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-utilities\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc 
kubenswrapper[4626]: I0223 08:31:24.265456 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jtwv\" (UniqueName: \"kubernetes.io/projected/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-kube-api-access-4jtwv\") pod \"redhat-operators-xw54h\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") " pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.344072 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xw54h" Feb 23 08:31:24 crc kubenswrapper[4626]: I0223 08:31:24.855899 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xw54h"] Feb 23 08:31:25 crc kubenswrapper[4626]: I0223 08:31:25.511128 4626 generic.go:334] "Generic (PLEG): container finished" podID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerID="5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95" exitCode=0 Feb 23 08:31:25 crc kubenswrapper[4626]: I0223 08:31:25.511342 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw54h" event={"ID":"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9","Type":"ContainerDied","Data":"5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95"} Feb 23 08:31:25 crc kubenswrapper[4626]: I0223 08:31:25.512811 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw54h" event={"ID":"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9","Type":"ContainerStarted","Data":"5ee7b222fc436cc2238a44cd8b42cf8787361dd6fbda0a1a71467e5812a355d7"} Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.414994 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9qb8n"] Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.417423 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.446895 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qb8n"]
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.515372 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-catalog-content\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.520937 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lcw\" (UniqueName: \"kubernetes.io/projected/dd530adf-4d5d-44d8-abd3-519953baa1ca-kube-api-access-l8lcw\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.521150 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-utilities\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.527011 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw54h" event={"ID":"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9","Type":"ContainerStarted","Data":"fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031"}
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.624047 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lcw\" (UniqueName: \"kubernetes.io/projected/dd530adf-4d5d-44d8-abd3-519953baa1ca-kube-api-access-l8lcw\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.624207 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-utilities\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.624398 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-catalog-content\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.624688 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-utilities\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.624749 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-catalog-content\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.648177 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lcw\" (UniqueName: \"kubernetes.io/projected/dd530adf-4d5d-44d8-abd3-519953baa1ca-kube-api-access-l8lcw\") pod \"redhat-marketplace-9qb8n\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") " pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:26 crc kubenswrapper[4626]: I0223 08:31:26.774683 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:27 crc kubenswrapper[4626]: I0223 08:31:27.338255 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qb8n"]
Feb 23 08:31:27 crc kubenswrapper[4626]: I0223 08:31:27.538611 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qb8n" event={"ID":"dd530adf-4d5d-44d8-abd3-519953baa1ca","Type":"ContainerStarted","Data":"68a53f3558c7b925d22a4ad8e49a2029caa6762b4aa4dc1e9eb2c7a74cf8a754"}
Feb 23 08:31:28 crc kubenswrapper[4626]: I0223 08:31:28.550254 4626 generic.go:334] "Generic (PLEG): container finished" podID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerID="442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53" exitCode=0
Feb 23 08:31:28 crc kubenswrapper[4626]: I0223 08:31:28.550398 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qb8n" event={"ID":"dd530adf-4d5d-44d8-abd3-519953baa1ca","Type":"ContainerDied","Data":"442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53"}
Feb 23 08:31:29 crc kubenswrapper[4626]: I0223 08:31:29.564880 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qb8n" event={"ID":"dd530adf-4d5d-44d8-abd3-519953baa1ca","Type":"ContainerStarted","Data":"32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783"}
Feb 23 08:31:29 crc kubenswrapper[4626]: I0223 08:31:29.567472 4626 generic.go:334] "Generic (PLEG): container finished" podID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerID="fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031" exitCode=0
Feb 23 08:31:29 crc kubenswrapper[4626]: I0223 08:31:29.567554 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw54h" event={"ID":"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9","Type":"ContainerDied","Data":"fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031"}
Feb 23 08:31:30 crc kubenswrapper[4626]: I0223 08:31:30.579460 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw54h" event={"ID":"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9","Type":"ContainerStarted","Data":"cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717"}
Feb 23 08:31:30 crc kubenswrapper[4626]: I0223 08:31:30.581004 4626 generic.go:334] "Generic (PLEG): container finished" podID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerID="32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783" exitCode=0
Feb 23 08:31:30 crc kubenswrapper[4626]: I0223 08:31:30.581053 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qb8n" event={"ID":"dd530adf-4d5d-44d8-abd3-519953baa1ca","Type":"ContainerDied","Data":"32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783"}
Feb 23 08:31:30 crc kubenswrapper[4626]: I0223 08:31:30.616114 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xw54h" podStartSLOduration=3.083250213 podStartE2EDuration="7.6160896s" podCreationTimestamp="2026-02-23 08:31:23 +0000 UTC" firstStartedPulling="2026-02-23 08:31:25.514674453 +0000 UTC m=+6637.854003710" lastFinishedPulling="2026-02-23 08:31:30.047513831 +0000 UTC m=+6642.386843097" observedRunningTime="2026-02-23 08:31:30.600713849 +0000 UTC m=+6642.940043116" watchObservedRunningTime="2026-02-23 08:31:30.6160896 +0000 UTC m=+6642.955418866"
Feb 23 08:31:31 crc kubenswrapper[4626]: I0223 08:31:31.592652 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qb8n" event={"ID":"dd530adf-4d5d-44d8-abd3-519953baa1ca","Type":"ContainerStarted","Data":"995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1"}
Feb 23 08:31:31 crc kubenswrapper[4626]: I0223 08:31:31.622194 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9qb8n" podStartSLOduration=2.983938867 podStartE2EDuration="5.622173415s" podCreationTimestamp="2026-02-23 08:31:26 +0000 UTC" firstStartedPulling="2026-02-23 08:31:28.552955775 +0000 UTC m=+6640.892285041" lastFinishedPulling="2026-02-23 08:31:31.191190322 +0000 UTC m=+6643.530519589" observedRunningTime="2026-02-23 08:31:31.615343297 +0000 UTC m=+6643.954672564" watchObservedRunningTime="2026-02-23 08:31:31.622173415 +0000 UTC m=+6643.961502681"
Feb 23 08:31:34 crc kubenswrapper[4626]: I0223 08:31:34.344807 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xw54h"
Feb 23 08:31:34 crc kubenswrapper[4626]: I0223 08:31:34.346111 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xw54h"
Feb 23 08:31:35 crc kubenswrapper[4626]: I0223 08:31:35.397764 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xw54h" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:31:35 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:31:35 crc kubenswrapper[4626]: >
Feb 23 08:31:36 crc kubenswrapper[4626]: I0223 08:31:36.774938 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:36 crc kubenswrapper[4626]: I0223 08:31:36.776800 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:36 crc kubenswrapper[4626]: I0223 08:31:36.824881 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:37 crc kubenswrapper[4626]: I0223 08:31:37.755859 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:37 crc kubenswrapper[4626]: I0223 08:31:37.848896 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qb8n"]
Feb 23 08:31:38 crc kubenswrapper[4626]: I0223 08:31:38.983650 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:31:38 crc kubenswrapper[4626]: E0223 08:31:38.984348 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:31:39 crc kubenswrapper[4626]: I0223 08:31:39.677339 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9qb8n" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="registry-server" containerID="cri-o://995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1" gracePeriod=2
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.219764 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.329013 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-utilities\") pod \"dd530adf-4d5d-44d8-abd3-519953baa1ca\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") "
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.329065 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-catalog-content\") pod \"dd530adf-4d5d-44d8-abd3-519953baa1ca\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") "
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.329222 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8lcw\" (UniqueName: \"kubernetes.io/projected/dd530adf-4d5d-44d8-abd3-519953baa1ca-kube-api-access-l8lcw\") pod \"dd530adf-4d5d-44d8-abd3-519953baa1ca\" (UID: \"dd530adf-4d5d-44d8-abd3-519953baa1ca\") "
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.330306 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-utilities" (OuterVolumeSpecName: "utilities") pod "dd530adf-4d5d-44d8-abd3-519953baa1ca" (UID: "dd530adf-4d5d-44d8-abd3-519953baa1ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.331200 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.338378 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd530adf-4d5d-44d8-abd3-519953baa1ca-kube-api-access-l8lcw" (OuterVolumeSpecName: "kube-api-access-l8lcw") pod "dd530adf-4d5d-44d8-abd3-519953baa1ca" (UID: "dd530adf-4d5d-44d8-abd3-519953baa1ca"). InnerVolumeSpecName "kube-api-access-l8lcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.348720 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd530adf-4d5d-44d8-abd3-519953baa1ca" (UID: "dd530adf-4d5d-44d8-abd3-519953baa1ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.434461 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd530adf-4d5d-44d8-abd3-519953baa1ca-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.434853 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8lcw\" (UniqueName: \"kubernetes.io/projected/dd530adf-4d5d-44d8-abd3-519953baa1ca-kube-api-access-l8lcw\") on node \"crc\" DevicePath \"\""
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.688764 4626 generic.go:334] "Generic (PLEG): container finished" podID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerID="995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1" exitCode=0
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.688889 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qb8n"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.688857 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qb8n" event={"ID":"dd530adf-4d5d-44d8-abd3-519953baa1ca","Type":"ContainerDied","Data":"995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1"}
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.689054 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qb8n" event={"ID":"dd530adf-4d5d-44d8-abd3-519953baa1ca","Type":"ContainerDied","Data":"68a53f3558c7b925d22a4ad8e49a2029caa6762b4aa4dc1e9eb2c7a74cf8a754"}
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.689083 4626 scope.go:117] "RemoveContainer" containerID="995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.726089 4626 scope.go:117] "RemoveContainer" containerID="32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.730084 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qb8n"]
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.745601 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qb8n"]
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.752403 4626 scope.go:117] "RemoveContainer" containerID="442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.814963 4626 scope.go:117] "RemoveContainer" containerID="995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1"
Feb 23 08:31:40 crc kubenswrapper[4626]: E0223 08:31:40.815601 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1\": container with ID starting with 995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1 not found: ID does not exist" containerID="995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.815643 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1"} err="failed to get container status \"995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1\": rpc error: code = NotFound desc = could not find container \"995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1\": container with ID starting with 995b4714569b961fd3bbd2741f42cbf89d9e08a5a68fe6ae28d077e01e0168a1 not found: ID does not exist"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.815669 4626 scope.go:117] "RemoveContainer" containerID="32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783"
Feb 23 08:31:40 crc kubenswrapper[4626]: E0223 08:31:40.816064 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783\": container with ID starting with 32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783 not found: ID does not exist" containerID="32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.816108 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783"} err="failed to get container status \"32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783\": rpc error: code = NotFound desc = could not find container \"32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783\": container with ID starting with 32f4bf6ffb5bd22ef910e2cdd8e148a48383f4dc8821269370aea2b2dc8ee783 not found: ID does not exist"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.816141 4626 scope.go:117] "RemoveContainer" containerID="442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53"
Feb 23 08:31:40 crc kubenswrapper[4626]: E0223 08:31:40.816615 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53\": container with ID starting with 442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53 not found: ID does not exist" containerID="442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53"
Feb 23 08:31:40 crc kubenswrapper[4626]: I0223 08:31:40.816667 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53"} err="failed to get container status \"442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53\": rpc error: code = NotFound desc = could not find container \"442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53\": container with ID starting with 442761938f352147782685fd6a8c84f2db97462b3f47d9a3548ade0278e04f53 not found: ID does not exist"
Feb 23 08:31:41 crc kubenswrapper[4626]: I0223 08:31:41.997421 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" path="/var/lib/kubelet/pods/dd530adf-4d5d-44d8-abd3-519953baa1ca/volumes"
Feb 23 08:31:45 crc kubenswrapper[4626]: I0223 08:31:45.385143 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xw54h" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:31:45 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:31:45 crc kubenswrapper[4626]: >
Feb 23 08:31:50 crc kubenswrapper[4626]: I0223 08:31:50.983589 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:31:50 crc kubenswrapper[4626]: E0223 08:31:50.984933 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:31:54 crc kubenswrapper[4626]: I0223 08:31:54.385641 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xw54h"
Feb 23 08:31:54 crc kubenswrapper[4626]: I0223 08:31:54.428334 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xw54h"
Feb 23 08:31:55 crc kubenswrapper[4626]: I0223 08:31:55.387005 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xw54h"]
Feb 23 08:31:55 crc kubenswrapper[4626]: I0223 08:31:55.857570 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xw54h" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="registry-server" containerID="cri-o://cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717" gracePeriod=2
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.298584 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xw54h"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.336312 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jtwv\" (UniqueName: \"kubernetes.io/projected/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-kube-api-access-4jtwv\") pod \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") "
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.336604 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-catalog-content\") pod \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") "
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.336811 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-utilities\") pod \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\" (UID: \"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9\") "
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.338671 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-utilities" (OuterVolumeSpecName: "utilities") pod "e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" (UID: "e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.354108 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-kube-api-access-4jtwv" (OuterVolumeSpecName: "kube-api-access-4jtwv") pod "e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" (UID: "e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9"). InnerVolumeSpecName "kube-api-access-4jtwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.436917 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" (UID: "e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.441802 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jtwv\" (UniqueName: \"kubernetes.io/projected/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-kube-api-access-4jtwv\") on node \"crc\" DevicePath \"\""
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.442111 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.442124 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.869114 4626 generic.go:334] "Generic (PLEG): container finished" podID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerID="cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717" exitCode=0
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.869248 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xw54h"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.869306 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw54h" event={"ID":"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9","Type":"ContainerDied","Data":"cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717"}
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.869392 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw54h" event={"ID":"e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9","Type":"ContainerDied","Data":"5ee7b222fc436cc2238a44cd8b42cf8787361dd6fbda0a1a71467e5812a355d7"}
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.869421 4626 scope.go:117] "RemoveContainer" containerID="cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.895067 4626 scope.go:117] "RemoveContainer" containerID="fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.905461 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xw54h"]
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.920833 4626 scope.go:117] "RemoveContainer" containerID="5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.924786 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xw54h"]
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.985488 4626 scope.go:117] "RemoveContainer" containerID="cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717"
Feb 23 08:31:56 crc kubenswrapper[4626]: E0223 08:31:56.986848 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717\": container with ID starting with cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717 not found: ID does not exist" containerID="cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.986909 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717"} err="failed to get container status \"cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717\": rpc error: code = NotFound desc = could not find container \"cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717\": container with ID starting with cd530f44239ed25ded0584b24537993e1d6f4eecdeee1869c315accece2e5717 not found: ID does not exist"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.986945 4626 scope.go:117] "RemoveContainer" containerID="fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031"
Feb 23 08:31:56 crc kubenswrapper[4626]: E0223 08:31:56.987682 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031\": container with ID starting with fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031 not found: ID does not exist" containerID="fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.987761 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031"} err="failed to get container status \"fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031\": rpc error: code = NotFound desc = could not find container \"fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031\": container with ID starting with fe4aafec73e02cc9b3802f9162d5cad27a2feb9854fef38f41d5437e1942a031 not found: ID does not exist"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.987806 4626 scope.go:117] "RemoveContainer" containerID="5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95"
Feb 23 08:31:56 crc kubenswrapper[4626]: E0223 08:31:56.994218 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95\": container with ID starting with 5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95 not found: ID does not exist" containerID="5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95"
Feb 23 08:31:56 crc kubenswrapper[4626]: I0223 08:31:56.994294 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95"} err="failed to get container status \"5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95\": rpc error: code = NotFound desc = could not find container \"5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95\": container with ID starting with 5cc8aef6ea683719cbe64cc9115a584f8c8bbd5563dcfbcb1d4f696f67ea1d95 not found: ID does not exist"
Feb 23 08:31:57 crc kubenswrapper[4626]: E0223 08:31:57.113733 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c80f63_84da_4a35_8ea0_f9ad1c8e99a9.slice/crio-5ee7b222fc436cc2238a44cd8b42cf8787361dd6fbda0a1a71467e5812a355d7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c80f63_84da_4a35_8ea0_f9ad1c8e99a9.slice\": RecentStats: unable to find data in memory cache]"
Feb 23 08:31:57 crc kubenswrapper[4626]: I0223 08:31:57.991838 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" path="/var/lib/kubelet/pods/e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9/volumes"
Feb 23 08:32:01 crc kubenswrapper[4626]: I0223 08:32:01.982303 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:32:01 crc kubenswrapper[4626]: E0223 08:32:01.983921 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:32:08 crc kubenswrapper[4626]: E0223 08:32:08.408258 4626 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.58:36874->192.168.26.58:46805: read tcp 192.168.26.58:36874->192.168.26.58:46805: read: connection reset by peer
Feb 23 08:32:12 crc kubenswrapper[4626]: I0223 08:32:12.983378 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:32:12 crc kubenswrapper[4626]: E0223 08:32:12.984137 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:32:23 crc kubenswrapper[4626]: I0223 08:32:23.982693 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:32:23 crc kubenswrapper[4626]: E0223 08:32:23.983468 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:32:37 crc kubenswrapper[4626]: I0223 08:32:37.988741 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:32:37 crc kubenswrapper[4626]: E0223 08:32:37.989878 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:32:49 crc kubenswrapper[4626]: I0223 08:32:49.983369 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:32:49 crc kubenswrapper[4626]: E0223 08:32:49.984426 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:32:58 crc kubenswrapper[4626]: E0223 08:32:58.472831 4626 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.58:46138->192.168.26.58:46805: read tcp 192.168.26.58:46138->192.168.26.58:46805: read: connection reset by peer
Feb 23 08:33:02 crc kubenswrapper[4626]: I0223 08:33:02.982696 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:33:02 crc kubenswrapper[4626]: E0223 08:33:02.983707 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:33:16 crc kubenswrapper[4626]: I0223 08:33:16.982516 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:33:16 crc kubenswrapper[4626]: E0223 08:33:16.985228 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:33:27 crc kubenswrapper[4626]: I0223 08:33:27.988332 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:33:27 crc kubenswrapper[4626]: E0223 08:33:27.989217 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:33:41 crc kubenswrapper[4626]: I0223 08:33:41.982027 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:33:41 crc kubenswrapper[4626]: E0223 08:33:41.982938 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:33:52 crc kubenswrapper[4626]: I0223 08:33:52.982154 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:33:52 crc kubenswrapper[4626]: E0223 08:33:52.983167 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:34:03 crc kubenswrapper[4626]: I0223 08:34:03.982098 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:34:03 crc kubenswrapper[4626]: E0223 08:34:03.982794 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:34:16 crc kubenswrapper[4626]: I0223 08:34:16.982793 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:34:16 crc kubenswrapper[4626]: E0223 08:34:16.983585 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:34:28 crc kubenswrapper[4626]: I0223 08:34:28.982143 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92" Feb 23 08:34:30 crc kubenswrapper[4626]: I0223 08:34:30.160184 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"cd3e113e85afe4fed581bf2049c4e93279c124a62a6ae1b2c772b0831e838560"} Feb 23 08:36:55 crc kubenswrapper[4626]: I0223 08:36:55.685556 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:36:55 crc kubenswrapper[4626]: I0223 08:36:55.687269 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" 
podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.714710 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8cdxj"] Feb 23 08:36:57 crc kubenswrapper[4626]: E0223 08:36:57.716556 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="extract-utilities" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.716823 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="extract-utilities" Feb 23 08:36:57 crc kubenswrapper[4626]: E0223 08:36:57.716844 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="registry-server" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.716852 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="registry-server" Feb 23 08:36:57 crc kubenswrapper[4626]: E0223 08:36:57.716876 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="extract-content" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.716881 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="extract-content" Feb 23 08:36:57 crc kubenswrapper[4626]: E0223 08:36:57.716901 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="registry-server" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.716907 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="registry-server" Feb 23 08:36:57 crc 
kubenswrapper[4626]: E0223 08:36:57.716918 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="extract-utilities" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.716923 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="extract-utilities" Feb 23 08:36:57 crc kubenswrapper[4626]: E0223 08:36:57.716934 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="extract-content" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.716942 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="extract-content" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.717643 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c80f63-84da-4a35-8ea0-f9ad1c8e99a9" containerName="registry-server" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.717675 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd530adf-4d5d-44d8-abd3-519953baa1ca" containerName="registry-server" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.721332 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.744816 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cdxj"] Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.909739 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-catalog-content\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.909869 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-utilities\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:57 crc kubenswrapper[4626]: I0223 08:36:57.910034 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wph5h\" (UniqueName: \"kubernetes.io/projected/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-kube-api-access-wph5h\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.011278 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-catalog-content\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.011398 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-utilities\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.011563 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wph5h\" (UniqueName: \"kubernetes.io/projected/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-kube-api-access-wph5h\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.012146 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-catalog-content\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.012740 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-utilities\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.035088 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wph5h\" (UniqueName: \"kubernetes.io/projected/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-kube-api-access-wph5h\") pod \"certified-operators-8cdxj\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.045062 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:36:58 crc kubenswrapper[4626]: I0223 08:36:58.988880 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cdxj"] Feb 23 08:36:59 crc kubenswrapper[4626]: W0223 08:36:59.002307 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3dcd05c_5940_4e0c_ae21_7a1d9503875f.slice/crio-edc79b33bafe6a0d91db354482b03dd7e312933a85ace8d6eb19fb2bc786aa67 WatchSource:0}: Error finding container edc79b33bafe6a0d91db354482b03dd7e312933a85ace8d6eb19fb2bc786aa67: Status 404 returned error can't find the container with id edc79b33bafe6a0d91db354482b03dd7e312933a85ace8d6eb19fb2bc786aa67 Feb 23 08:36:59 crc kubenswrapper[4626]: I0223 08:36:59.518167 4626 generic.go:334] "Generic (PLEG): container finished" podID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerID="192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069" exitCode=0 Feb 23 08:36:59 crc kubenswrapper[4626]: I0223 08:36:59.518274 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cdxj" event={"ID":"a3dcd05c-5940-4e0c-ae21-7a1d9503875f","Type":"ContainerDied","Data":"192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069"} Feb 23 08:36:59 crc kubenswrapper[4626]: I0223 08:36:59.518550 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cdxj" event={"ID":"a3dcd05c-5940-4e0c-ae21-7a1d9503875f","Type":"ContainerStarted","Data":"edc79b33bafe6a0d91db354482b03dd7e312933a85ace8d6eb19fb2bc786aa67"} Feb 23 08:36:59 crc kubenswrapper[4626]: I0223 08:36:59.521345 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:37:00 crc kubenswrapper[4626]: I0223 08:37:00.531950 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8cdxj" event={"ID":"a3dcd05c-5940-4e0c-ae21-7a1d9503875f","Type":"ContainerStarted","Data":"392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751"} Feb 23 08:37:01 crc kubenswrapper[4626]: I0223 08:37:01.541083 4626 generic.go:334] "Generic (PLEG): container finished" podID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerID="392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751" exitCode=0 Feb 23 08:37:01 crc kubenswrapper[4626]: I0223 08:37:01.541143 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cdxj" event={"ID":"a3dcd05c-5940-4e0c-ae21-7a1d9503875f","Type":"ContainerDied","Data":"392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751"} Feb 23 08:37:02 crc kubenswrapper[4626]: I0223 08:37:02.557210 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cdxj" event={"ID":"a3dcd05c-5940-4e0c-ae21-7a1d9503875f","Type":"ContainerStarted","Data":"d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7"} Feb 23 08:37:02 crc kubenswrapper[4626]: I0223 08:37:02.581274 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8cdxj" podStartSLOduration=3.074052559 podStartE2EDuration="5.580253461s" podCreationTimestamp="2026-02-23 08:36:57 +0000 UTC" firstStartedPulling="2026-02-23 08:36:59.520209156 +0000 UTC m=+6971.859538422" lastFinishedPulling="2026-02-23 08:37:02.026410058 +0000 UTC m=+6974.365739324" observedRunningTime="2026-02-23 08:37:02.575928796 +0000 UTC m=+6974.915258062" watchObservedRunningTime="2026-02-23 08:37:02.580253461 +0000 UTC m=+6974.919582727" Feb 23 08:37:08 crc kubenswrapper[4626]: I0223 08:37:08.045523 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:37:08 crc kubenswrapper[4626]: I0223 08:37:08.046877 
4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:37:08 crc kubenswrapper[4626]: I0223 08:37:08.089399 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:37:08 crc kubenswrapper[4626]: I0223 08:37:08.647938 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:37:08 crc kubenswrapper[4626]: I0223 08:37:08.693824 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cdxj"] Feb 23 08:37:10 crc kubenswrapper[4626]: I0223 08:37:10.630612 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8cdxj" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="registry-server" containerID="cri-o://d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7" gracePeriod=2 Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.153822 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.221727 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-utilities\") pod \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.221910 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-catalog-content\") pod \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.222148 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wph5h\" (UniqueName: \"kubernetes.io/projected/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-kube-api-access-wph5h\") pod \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\" (UID: \"a3dcd05c-5940-4e0c-ae21-7a1d9503875f\") " Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.222210 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-utilities" (OuterVolumeSpecName: "utilities") pod "a3dcd05c-5940-4e0c-ae21-7a1d9503875f" (UID: "a3dcd05c-5940-4e0c-ae21-7a1d9503875f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.222887 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.231626 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-kube-api-access-wph5h" (OuterVolumeSpecName: "kube-api-access-wph5h") pod "a3dcd05c-5940-4e0c-ae21-7a1d9503875f" (UID: "a3dcd05c-5940-4e0c-ae21-7a1d9503875f"). InnerVolumeSpecName "kube-api-access-wph5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.270138 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3dcd05c-5940-4e0c-ae21-7a1d9503875f" (UID: "a3dcd05c-5940-4e0c-ae21-7a1d9503875f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.325967 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.326005 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wph5h\" (UniqueName: \"kubernetes.io/projected/a3dcd05c-5940-4e0c-ae21-7a1d9503875f-kube-api-access-wph5h\") on node \"crc\" DevicePath \"\"" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.648409 4626 generic.go:334] "Generic (PLEG): container finished" podID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerID="d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7" exitCode=0 Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.648483 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cdxj" event={"ID":"a3dcd05c-5940-4e0c-ae21-7a1d9503875f","Type":"ContainerDied","Data":"d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7"} Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.648551 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cdxj" event={"ID":"a3dcd05c-5940-4e0c-ae21-7a1d9503875f","Type":"ContainerDied","Data":"edc79b33bafe6a0d91db354482b03dd7e312933a85ace8d6eb19fb2bc786aa67"} Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.648593 4626 scope.go:117] "RemoveContainer" containerID="d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.648862 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cdxj" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.677230 4626 scope.go:117] "RemoveContainer" containerID="392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.705268 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cdxj"] Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.709001 4626 scope.go:117] "RemoveContainer" containerID="192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.718999 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8cdxj"] Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.740628 4626 scope.go:117] "RemoveContainer" containerID="d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7" Feb 23 08:37:11 crc kubenswrapper[4626]: E0223 08:37:11.741929 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7\": container with ID starting with d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7 not found: ID does not exist" containerID="d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.742045 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7"} err="failed to get container status \"d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7\": rpc error: code = NotFound desc = could not find container \"d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7\": container with ID starting with d723a69a5feaf7631842d25e2529837efdb5cb07d9d124136d8c8ec6e55af0f7 not 
found: ID does not exist" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.742135 4626 scope.go:117] "RemoveContainer" containerID="392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751" Feb 23 08:37:11 crc kubenswrapper[4626]: E0223 08:37:11.742529 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751\": container with ID starting with 392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751 not found: ID does not exist" containerID="392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.742625 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751"} err="failed to get container status \"392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751\": rpc error: code = NotFound desc = could not find container \"392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751\": container with ID starting with 392abb1c768122dad05cd7e639a572fa321d870723ff856ee9528f84aa5dc751 not found: ID does not exist" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.742713 4626 scope.go:117] "RemoveContainer" containerID="192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069" Feb 23 08:37:11 crc kubenswrapper[4626]: E0223 08:37:11.743047 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069\": container with ID starting with 192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069 not found: ID does not exist" containerID="192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.743091 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069"} err="failed to get container status \"192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069\": rpc error: code = NotFound desc = could not find container \"192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069\": container with ID starting with 192ff08f7c0a95604b3eeb4241011b4a10a9a6d40f9f4e923f8166dd2e4e2069 not found: ID does not exist" Feb 23 08:37:11 crc kubenswrapper[4626]: I0223 08:37:11.992989 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" path="/var/lib/kubelet/pods/a3dcd05c-5940-4e0c-ae21-7a1d9503875f/volumes" Feb 23 08:37:18 crc kubenswrapper[4626]: E0223 08:37:18.806941 4626 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.58:41368->192.168.26.58:46805: write tcp 192.168.26.58:41368->192.168.26.58:46805: write: connection reset by peer Feb 23 08:37:25 crc kubenswrapper[4626]: I0223 08:37:25.685619 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:37:25 crc kubenswrapper[4626]: I0223 08:37:25.687219 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:37:55 crc kubenswrapper[4626]: I0223 08:37:55.685077 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:37:55 crc kubenswrapper[4626]: I0223 08:37:55.685788 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:37:55 crc kubenswrapper[4626]: I0223 08:37:55.685857 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 08:37:55 crc kubenswrapper[4626]: I0223 08:37:55.686869 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd3e113e85afe4fed581bf2049c4e93279c124a62a6ae1b2c772b0831e838560"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:37:55 crc kubenswrapper[4626]: I0223 08:37:55.686939 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://cd3e113e85afe4fed581bf2049c4e93279c124a62a6ae1b2c772b0831e838560" gracePeriod=600
Feb 23 08:37:56 crc kubenswrapper[4626]: I0223 08:37:56.059875 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="cd3e113e85afe4fed581bf2049c4e93279c124a62a6ae1b2c772b0831e838560" exitCode=0
Feb 23 08:37:56 crc kubenswrapper[4626]: I0223 08:37:56.060069 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"cd3e113e85afe4fed581bf2049c4e93279c124a62a6ae1b2c772b0831e838560"}
Feb 23 08:37:56 crc kubenswrapper[4626]: I0223 08:37:56.060207 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"}
Feb 23 08:37:56 crc kubenswrapper[4626]: I0223 08:37:56.060229 4626 scope.go:117] "RemoveContainer" containerID="3f1149d76a3f96d7105cc0671f833f4d753d1e029268ab40ada03287a2e90c92"
Feb 23 08:39:55 crc kubenswrapper[4626]: I0223 08:39:55.685072 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:39:55 crc kubenswrapper[4626]: I0223 08:39:55.685826 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:40:25 crc kubenswrapper[4626]: I0223 08:40:25.685459 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:40:25 crc kubenswrapper[4626]: I0223 08:40:25.686202 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:40:55 crc kubenswrapper[4626]: I0223 08:40:55.685556 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:40:55 crc kubenswrapper[4626]: I0223 08:40:55.686208 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:40:55 crc kubenswrapper[4626]: I0223 08:40:55.686270 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 08:40:55 crc kubenswrapper[4626]: I0223 08:40:55.686892 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:40:55 crc kubenswrapper[4626]: I0223 08:40:55.686966 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" gracePeriod=600
Feb 23 08:40:55 crc kubenswrapper[4626]: E0223 08:40:55.822442 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:40:56 crc kubenswrapper[4626]: I0223 08:40:56.788998 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" exitCode=0
Feb 23 08:40:56 crc kubenswrapper[4626]: I0223 08:40:56.789051 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"}
Feb 23 08:40:56 crc kubenswrapper[4626]: I0223 08:40:56.789096 4626 scope.go:117] "RemoveContainer" containerID="cd3e113e85afe4fed581bf2049c4e93279c124a62a6ae1b2c772b0831e838560"
Feb 23 08:40:56 crc kubenswrapper[4626]: I0223 08:40:56.789971 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"
Feb 23 08:40:56 crc kubenswrapper[4626]: E0223 08:40:56.790512 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:41:09 crc kubenswrapper[4626]: I0223 08:41:09.982766 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"
Feb 23 08:41:09 crc kubenswrapper[4626]: E0223 08:41:09.983572 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:41:20 crc kubenswrapper[4626]: I0223 08:41:20.983196 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"
Feb 23 08:41:20 crc kubenswrapper[4626]: E0223 08:41:20.984725 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:41:35 crc kubenswrapper[4626]: I0223 08:41:35.984339 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"
Feb 23 08:41:35 crc kubenswrapper[4626]: E0223 08:41:35.987369 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:41:49 crc kubenswrapper[4626]: I0223 08:41:49.981799 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"
Feb 23 08:41:49 crc kubenswrapper[4626]: E0223 08:41:49.982946 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.331941 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wv9m5"]
Feb 23 08:41:58 crc kubenswrapper[4626]: E0223 08:41:58.333244 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="extract-content"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.333266 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="extract-content"
Feb 23 08:41:58 crc kubenswrapper[4626]: E0223 08:41:58.333294 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="extract-utilities"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.333302 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="extract-utilities"
Feb 23 08:41:58 crc kubenswrapper[4626]: E0223 08:41:58.333328 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="registry-server"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.333335 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="registry-server"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.333644 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dcd05c-5940-4e0c-ae21-7a1d9503875f" containerName="registry-server"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.335210 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.349394 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv9m5"]
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.492395 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-utilities\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.492482 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqk9d\" (UniqueName: \"kubernetes.io/projected/80f1678f-65fc-4ae7-b3ca-28cd0d254833-kube-api-access-zqk9d\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.492904 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-catalog-content\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.595079 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-catalog-content\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.595191 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-utilities\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.595248 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqk9d\" (UniqueName: \"kubernetes.io/projected/80f1678f-65fc-4ae7-b3ca-28cd0d254833-kube-api-access-zqk9d\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.595644 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-catalog-content\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.595990 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-utilities\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.620456 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqk9d\" (UniqueName: \"kubernetes.io/projected/80f1678f-65fc-4ae7-b3ca-28cd0d254833-kube-api-access-zqk9d\") pod \"redhat-marketplace-wv9m5\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") " pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:58 crc kubenswrapper[4626]: I0223 08:41:58.665562 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:41:59 crc kubenswrapper[4626]: I0223 08:41:59.395222 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv9m5"]
Feb 23 08:41:59 crc kubenswrapper[4626]: I0223 08:41:59.454788 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv9m5" event={"ID":"80f1678f-65fc-4ae7-b3ca-28cd0d254833","Type":"ContainerStarted","Data":"42be786377af6e48e5642d33fb8dcbc8ea875af8510275ba81accaf4ac1d70a7"}
Feb 23 08:42:00 crc kubenswrapper[4626]: I0223 08:42:00.468417 4626 generic.go:334] "Generic (PLEG): container finished" podID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerID="ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16" exitCode=0
Feb 23 08:42:00 crc kubenswrapper[4626]: I0223 08:42:00.468485 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv9m5" event={"ID":"80f1678f-65fc-4ae7-b3ca-28cd0d254833","Type":"ContainerDied","Data":"ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16"}
Feb 23 08:42:00 crc kubenswrapper[4626]: I0223 08:42:00.474738 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:42:01 crc kubenswrapper[4626]: I0223 08:42:01.480574 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv9m5" event={"ID":"80f1678f-65fc-4ae7-b3ca-28cd0d254833","Type":"ContainerStarted","Data":"15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29"}
Feb 23 08:42:02 crc kubenswrapper[4626]: I0223 08:42:02.492289 4626 generic.go:334] "Generic (PLEG): container finished" podID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerID="15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29" exitCode=0
Feb 23 08:42:02 crc kubenswrapper[4626]: I0223 08:42:02.492353 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv9m5" event={"ID":"80f1678f-65fc-4ae7-b3ca-28cd0d254833","Type":"ContainerDied","Data":"15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29"}
Feb 23 08:42:03 crc kubenswrapper[4626]: I0223 08:42:03.507838 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv9m5" event={"ID":"80f1678f-65fc-4ae7-b3ca-28cd0d254833","Type":"ContainerStarted","Data":"1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395"}
Feb 23 08:42:03 crc kubenswrapper[4626]: I0223 08:42:03.526713 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wv9m5" podStartSLOduration=3.040550387 podStartE2EDuration="5.526697471s" podCreationTimestamp="2026-02-23 08:41:58 +0000 UTC" firstStartedPulling="2026-02-23 08:42:00.474477993 +0000 UTC m=+7272.813807259" lastFinishedPulling="2026-02-23 08:42:02.960625077 +0000 UTC m=+7275.299954343" observedRunningTime="2026-02-23 08:42:03.523044414 +0000 UTC m=+7275.862373680" watchObservedRunningTime="2026-02-23 08:42:03.526697471 +0000 UTC m=+7275.866026737"
Feb 23 08:42:03 crc kubenswrapper[4626]: I0223 08:42:03.982019 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"
Feb 23 08:42:03 crc kubenswrapper[4626]: E0223 08:42:03.982588 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:42:08 crc kubenswrapper[4626]: I0223 08:42:08.667432 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:42:08 crc kubenswrapper[4626]: I0223 08:42:08.667821 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:42:08 crc kubenswrapper[4626]: I0223 08:42:08.714648 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:42:09 crc kubenswrapper[4626]: I0223 08:42:09.611950 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:42:09 crc kubenswrapper[4626]: I0223 08:42:09.658096 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv9m5"]
Feb 23 08:42:11 crc kubenswrapper[4626]: I0223 08:42:11.587471 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wv9m5" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="registry-server" containerID="cri-o://1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395" gracePeriod=2
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.083490 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.232311 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqk9d\" (UniqueName: \"kubernetes.io/projected/80f1678f-65fc-4ae7-b3ca-28cd0d254833-kube-api-access-zqk9d\") pod \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") "
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.232684 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-catalog-content\") pod \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") "
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.232948 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-utilities\") pod \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\" (UID: \"80f1678f-65fc-4ae7-b3ca-28cd0d254833\") "
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.233606 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-utilities" (OuterVolumeSpecName: "utilities") pod "80f1678f-65fc-4ae7-b3ca-28cd0d254833" (UID: "80f1678f-65fc-4ae7-b3ca-28cd0d254833"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.233856 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.242777 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f1678f-65fc-4ae7-b3ca-28cd0d254833-kube-api-access-zqk9d" (OuterVolumeSpecName: "kube-api-access-zqk9d") pod "80f1678f-65fc-4ae7-b3ca-28cd0d254833" (UID: "80f1678f-65fc-4ae7-b3ca-28cd0d254833"). InnerVolumeSpecName "kube-api-access-zqk9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.253141 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80f1678f-65fc-4ae7-b3ca-28cd0d254833" (UID: "80f1678f-65fc-4ae7-b3ca-28cd0d254833"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.336749 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80f1678f-65fc-4ae7-b3ca-28cd0d254833-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.336819 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqk9d\" (UniqueName: \"kubernetes.io/projected/80f1678f-65fc-4ae7-b3ca-28cd0d254833-kube-api-access-zqk9d\") on node \"crc\" DevicePath \"\""
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.600950 4626 generic.go:334] "Generic (PLEG): container finished" podID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerID="1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395" exitCode=0
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.601033 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv9m5"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.601059 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv9m5" event={"ID":"80f1678f-65fc-4ae7-b3ca-28cd0d254833","Type":"ContainerDied","Data":"1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395"}
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.602434 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv9m5" event={"ID":"80f1678f-65fc-4ae7-b3ca-28cd0d254833","Type":"ContainerDied","Data":"42be786377af6e48e5642d33fb8dcbc8ea875af8510275ba81accaf4ac1d70a7"}
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.602460 4626 scope.go:117] "RemoveContainer" containerID="1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.643270 4626 scope.go:117] "RemoveContainer" containerID="15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.657317 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv9m5"]
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.665189 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv9m5"]
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.678233 4626 scope.go:117] "RemoveContainer" containerID="ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.717346 4626 scope.go:117] "RemoveContainer" containerID="1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395"
Feb 23 08:42:12 crc kubenswrapper[4626]: E0223 08:42:12.717866 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395\": container with ID starting with 1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395 not found: ID does not exist" containerID="1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.717911 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395"} err="failed to get container status \"1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395\": rpc error: code = NotFound desc = could not find container \"1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395\": container with ID starting with 1f87414eccc64d209ea94663b1af2546b20827b24b4d0c4e3ca2b008c034e395 not found: ID does not exist"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.717941 4626 scope.go:117] "RemoveContainer" containerID="15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29"
Feb 23 08:42:12 crc kubenswrapper[4626]: E0223 08:42:12.719953 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29\": container with ID starting with 15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29 not found: ID does not exist" containerID="15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.720007 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29"} err="failed to get container status \"15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29\": rpc error: code = NotFound desc = could not find container \"15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29\": container with ID starting with 15e5afed402ea7932a71e093e29659f822888bfee5abe5e6e40010a77e000f29 not found: ID does not exist"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.720045 4626 scope.go:117] "RemoveContainer" containerID="ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16"
Feb 23 08:42:12 crc kubenswrapper[4626]: E0223 08:42:12.720398 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16\": container with ID starting with ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16 not found: ID does not exist" containerID="ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16"
Feb 23 08:42:12 crc kubenswrapper[4626]: I0223 08:42:12.720433 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16"} err="failed to get container status \"ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16\": rpc error: code = NotFound desc = could not find container \"ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16\": container with ID starting with ae643c69e60dfe6566f1b8a82bfc00eb8e2bb908cc71cb0b8d70a9335c12cb16 not found: ID does not exist"
Feb 23 08:42:13 crc kubenswrapper[4626]: I0223 08:42:13.995688 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" path="/var/lib/kubelet/pods/80f1678f-65fc-4ae7-b3ca-28cd0d254833/volumes"
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.940126 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvcrh"]
Feb 23 08:42:14 crc kubenswrapper[4626]: E0223 08:42:14.940577 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="extract-utilities"
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.940601 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="extract-utilities"
Feb 23 08:42:14 crc kubenswrapper[4626]: E0223 08:42:14.940626 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="registry-server"
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.940633 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="registry-server"
Feb 23 08:42:14 crc kubenswrapper[4626]: E0223 08:42:14.940681 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="extract-content"
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.940688 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="extract-content"
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.940921 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f1678f-65fc-4ae7-b3ca-28cd0d254833" containerName="registry-server"
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.942912 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.954518 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvcrh"]
Feb 23 08:42:14 crc kubenswrapper[4626]: I0223 08:42:14.999471 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-catalog-content\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:14.999887 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6xll\" (UniqueName: \"kubernetes.io/projected/cc6391b3-65a8-48b5-beb9-5e7e282b5890-kube-api-access-w6xll\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.000178 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-utilities\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.101816 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-utilities\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.101985 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-catalog-content\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.102048 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6xll\" (UniqueName: \"kubernetes.io/projected/cc6391b3-65a8-48b5-beb9-5e7e282b5890-kube-api-access-w6xll\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.102320 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-utilities\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.102375 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-catalog-content\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.119742 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6xll\" (UniqueName: \"kubernetes.io/projected/cc6391b3-65a8-48b5-beb9-5e7e282b5890-kube-api-access-w6xll\") pod \"redhat-operators-wvcrh\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.259459 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcrh"
Feb 23 08:42:15 crc kubenswrapper[4626]: I0223 08:42:15.742316 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvcrh"]
Feb 23 08:42:16 crc kubenswrapper[4626]: I0223 08:42:16.650972 4626 generic.go:334] "Generic (PLEG): container finished" podID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerID="e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a" exitCode=0
Feb 23 08:42:16 crc kubenswrapper[4626]: I0223 08:42:16.651120 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcrh" event={"ID":"cc6391b3-65a8-48b5-beb9-5e7e282b5890","Type":"ContainerDied","Data":"e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a"}
Feb 23 08:42:16 crc kubenswrapper[4626]: I0223 08:42:16.651404 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcrh" event={"ID":"cc6391b3-65a8-48b5-beb9-5e7e282b5890","Type":"ContainerStarted","Data":"07845b04402a9ce405058db2403d6de0bbfdff6fb6e6ff55f31a417ea8b33c19"}
Feb 23 08:42:17 crc kubenswrapper[4626]: I0223 08:42:17.662304 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcrh" event={"ID":"cc6391b3-65a8-48b5-beb9-5e7e282b5890","Type":"ContainerStarted","Data":"a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172"}
Feb 23 08:42:17 crc kubenswrapper[4626]: I0223 08:42:17.988265 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3"
Feb 23 08:42:17 crc kubenswrapper[4626]: E0223 08:42:17.988624 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:42:20 crc kubenswrapper[4626]: I0223 08:42:20.689022 4626 generic.go:334] "Generic (PLEG): container finished" podID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerID="a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172" exitCode=0
Feb 23 08:42:20 crc kubenswrapper[4626]: I0223 08:42:20.689104 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcrh" event={"ID":"cc6391b3-65a8-48b5-beb9-5e7e282b5890","Type":"ContainerDied","Data":"a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172"}
Feb 23 08:42:21 crc kubenswrapper[4626]: I0223 08:42:21.714738 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcrh" event={"ID":"cc6391b3-65a8-48b5-beb9-5e7e282b5890","Type":"ContainerStarted","Data":"af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299"}
Feb 23 08:42:21 crc kubenswrapper[4626]: I0223 08:42:21.744977 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvcrh" podStartSLOduration=3.241429815 podStartE2EDuration="7.744956372s" podCreationTimestamp="2026-02-23 08:42:14 +0000 UTC" firstStartedPulling="2026-02-23 08:42:16.653627092 +0000 UTC m=+7288.992956358" lastFinishedPulling="2026-02-23 08:42:21.15715365 +0000 UTC m=+7293.496482915" observedRunningTime="2026-02-23 08:42:21.735578663 +0000 UTC m=+7294.074907929" watchObservedRunningTime="2026-02-23 08:42:21.744956372 +0000 UTC m=+7294.084285629"
Feb 23 08:42:25 crc kubenswrapper[4626]: I0223 08:42:25.259706 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvcrh" Feb 23 08:42:25 crc kubenswrapper[4626]: I0223 08:42:25.260552 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvcrh" Feb 23 08:42:26 crc kubenswrapper[4626]: I0223 08:42:26.299568 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvcrh" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="registry-server" probeResult="failure" output=< Feb 23 08:42:26 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 08:42:26 crc kubenswrapper[4626]: > Feb 23 08:42:30 crc kubenswrapper[4626]: I0223 08:42:30.982197 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:42:30 crc kubenswrapper[4626]: E0223 08:42:30.983192 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:42:35 crc kubenswrapper[4626]: I0223 08:42:35.299152 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvcrh" Feb 23 08:42:35 crc kubenswrapper[4626]: I0223 08:42:35.339811 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvcrh" Feb 23 08:42:35 crc kubenswrapper[4626]: I0223 08:42:35.533887 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-wvcrh"] Feb 23 08:42:36 crc kubenswrapper[4626]: I0223 08:42:36.871897 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvcrh" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="registry-server" containerID="cri-o://af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299" gracePeriod=2 Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.370762 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcrh" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.441492 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6xll\" (UniqueName: \"kubernetes.io/projected/cc6391b3-65a8-48b5-beb9-5e7e282b5890-kube-api-access-w6xll\") pod \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.441566 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-utilities\") pod \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.441716 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-catalog-content\") pod \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\" (UID: \"cc6391b3-65a8-48b5-beb9-5e7e282b5890\") " Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.442155 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-utilities" (OuterVolumeSpecName: "utilities") pod "cc6391b3-65a8-48b5-beb9-5e7e282b5890" (UID: 
"cc6391b3-65a8-48b5-beb9-5e7e282b5890"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.442470 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.450613 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6391b3-65a8-48b5-beb9-5e7e282b5890-kube-api-access-w6xll" (OuterVolumeSpecName: "kube-api-access-w6xll") pod "cc6391b3-65a8-48b5-beb9-5e7e282b5890" (UID: "cc6391b3-65a8-48b5-beb9-5e7e282b5890"). InnerVolumeSpecName "kube-api-access-w6xll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.545159 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6xll\" (UniqueName: \"kubernetes.io/projected/cc6391b3-65a8-48b5-beb9-5e7e282b5890-kube-api-access-w6xll\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.551113 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc6391b3-65a8-48b5-beb9-5e7e282b5890" (UID: "cc6391b3-65a8-48b5-beb9-5e7e282b5890"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.647387 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc6391b3-65a8-48b5-beb9-5e7e282b5890-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.884708 4626 generic.go:334] "Generic (PLEG): container finished" podID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerID="af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299" exitCode=0 Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.884765 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcrh" event={"ID":"cc6391b3-65a8-48b5-beb9-5e7e282b5890","Type":"ContainerDied","Data":"af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299"} Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.884779 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvcrh" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.884794 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvcrh" event={"ID":"cc6391b3-65a8-48b5-beb9-5e7e282b5890","Type":"ContainerDied","Data":"07845b04402a9ce405058db2403d6de0bbfdff6fb6e6ff55f31a417ea8b33c19"} Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.884816 4626 scope.go:117] "RemoveContainer" containerID="af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.913629 4626 scope.go:117] "RemoveContainer" containerID="a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.922977 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvcrh"] Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.931434 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvcrh"] Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.934313 4626 scope.go:117] "RemoveContainer" containerID="e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.974355 4626 scope.go:117] "RemoveContainer" containerID="af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299" Feb 23 08:42:37 crc kubenswrapper[4626]: E0223 08:42:37.975058 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299\": container with ID starting with af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299 not found: ID does not exist" containerID="af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.975102 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299"} err="failed to get container status \"af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299\": rpc error: code = NotFound desc = could not find container \"af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299\": container with ID starting with af085bd4c4f02189913d0662b66cd14aafe232e8a1548a9ec0bb0b351f318299 not found: ID does not exist" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.975130 4626 scope.go:117] "RemoveContainer" containerID="a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172" Feb 23 08:42:37 crc kubenswrapper[4626]: E0223 08:42:37.975563 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172\": container with ID starting with a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172 not found: ID does not exist" containerID="a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.975599 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172"} err="failed to get container status \"a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172\": rpc error: code = NotFound desc = could not find container \"a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172\": container with ID starting with a23722d597a2fb6bfceeab932715bea910012e04c732da5460d89a06a5600172 not found: ID does not exist" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.975623 4626 scope.go:117] "RemoveContainer" containerID="e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a" Feb 23 08:42:37 crc kubenswrapper[4626]: E0223 
08:42:37.976389 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a\": container with ID starting with e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a not found: ID does not exist" containerID="e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.976417 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a"} err="failed to get container status \"e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a\": rpc error: code = NotFound desc = could not find container \"e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a\": container with ID starting with e6ec1f6807edc9b6e2421e092ef261e170b2c49c9e5ab5c760ac1ce14ff8d10a not found: ID does not exist" Feb 23 08:42:37 crc kubenswrapper[4626]: I0223 08:42:37.996464 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" path="/var/lib/kubelet/pods/cc6391b3-65a8-48b5-beb9-5e7e282b5890/volumes" Feb 23 08:42:44 crc kubenswrapper[4626]: I0223 08:42:44.982942 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:42:44 crc kubenswrapper[4626]: E0223 08:42:44.983778 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:42:56 crc kubenswrapper[4626]: I0223 08:42:56.982378 
4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:42:56 crc kubenswrapper[4626]: E0223 08:42:56.982953 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:43:10 crc kubenswrapper[4626]: I0223 08:43:10.981651 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:43:10 crc kubenswrapper[4626]: E0223 08:43:10.982524 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:43:23 crc kubenswrapper[4626]: I0223 08:43:23.983430 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:43:23 crc kubenswrapper[4626]: E0223 08:43:23.984648 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:43:36 crc kubenswrapper[4626]: I0223 
08:43:36.987982 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:43:36 crc kubenswrapper[4626]: E0223 08:43:36.992809 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:43:48 crc kubenswrapper[4626]: I0223 08:43:48.982143 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:43:48 crc kubenswrapper[4626]: E0223 08:43:48.983171 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:44:00 crc kubenswrapper[4626]: I0223 08:44:00.982464 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:44:00 crc kubenswrapper[4626]: E0223 08:44:00.983421 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:44:13 crc 
kubenswrapper[4626]: I0223 08:44:13.982913 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:44:13 crc kubenswrapper[4626]: E0223 08:44:13.983784 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:44:27 crc kubenswrapper[4626]: I0223 08:44:27.988273 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:44:27 crc kubenswrapper[4626]: E0223 08:44:27.989094 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:44:40 crc kubenswrapper[4626]: I0223 08:44:40.981967 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:44:40 crc kubenswrapper[4626]: E0223 08:44:40.982806 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 
23 08:44:54 crc kubenswrapper[4626]: I0223 08:44:54.982180 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:44:54 crc kubenswrapper[4626]: E0223 08:44:54.983172 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.198182 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh"] Feb 23 08:45:00 crc kubenswrapper[4626]: E0223 08:45:00.199540 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="extract-content" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.199567 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="extract-content" Feb 23 08:45:00 crc kubenswrapper[4626]: E0223 08:45:00.199623 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="registry-server" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.199629 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="registry-server" Feb 23 08:45:00 crc kubenswrapper[4626]: E0223 08:45:00.199657 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="extract-utilities" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.199666 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="extract-utilities" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.200176 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6391b3-65a8-48b5-beb9-5e7e282b5890" containerName="registry-server" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.201337 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.220737 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.220848 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.223745 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6897aa9-e088-451a-99bb-d45572959840-config-volume\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.224109 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5j86\" (UniqueName: \"kubernetes.io/projected/f6897aa9-e088-451a-99bb-d45572959840-kube-api-access-j5j86\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.224237 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f6897aa9-e088-451a-99bb-d45572959840-secret-volume\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.297551 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh"] Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.325825 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6897aa9-e088-451a-99bb-d45572959840-config-volume\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.325972 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5j86\" (UniqueName: \"kubernetes.io/projected/f6897aa9-e088-451a-99bb-d45572959840-kube-api-access-j5j86\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.326035 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6897aa9-e088-451a-99bb-d45572959840-secret-volume\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.327636 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6897aa9-e088-451a-99bb-d45572959840-config-volume\") pod \"collect-profiles-29530605-hddgh\" 
(UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.342124 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6897aa9-e088-451a-99bb-d45572959840-secret-volume\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.342184 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5j86\" (UniqueName: \"kubernetes.io/projected/f6897aa9-e088-451a-99bb-d45572959840-kube-api-access-j5j86\") pod \"collect-profiles-29530605-hddgh\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:00 crc kubenswrapper[4626]: I0223 08:45:00.535461 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:01 crc kubenswrapper[4626]: I0223 08:45:01.019193 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh"] Feb 23 08:45:01 crc kubenswrapper[4626]: I0223 08:45:01.275730 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" event={"ID":"f6897aa9-e088-451a-99bb-d45572959840","Type":"ContainerStarted","Data":"d4ddd2b4f049a13ba2d4b801edbf3a0f706cf5ba2678d8210047de1153a9b840"} Feb 23 08:45:01 crc kubenswrapper[4626]: I0223 08:45:01.275793 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" event={"ID":"f6897aa9-e088-451a-99bb-d45572959840","Type":"ContainerStarted","Data":"39cfca65769cda9b9476485aafdaeee38a50d9512679a81161c8053f4dbab59f"} Feb 23 08:45:01 crc kubenswrapper[4626]: I0223 08:45:01.294548 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" podStartSLOduration=1.294536439 podStartE2EDuration="1.294536439s" podCreationTimestamp="2026-02-23 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:01.287092466 +0000 UTC m=+7453.626421731" watchObservedRunningTime="2026-02-23 08:45:01.294536439 +0000 UTC m=+7453.633865706" Feb 23 08:45:02 crc kubenswrapper[4626]: I0223 08:45:02.285111 4626 generic.go:334] "Generic (PLEG): container finished" podID="f6897aa9-e088-451a-99bb-d45572959840" containerID="d4ddd2b4f049a13ba2d4b801edbf3a0f706cf5ba2678d8210047de1153a9b840" exitCode=0 Feb 23 08:45:02 crc kubenswrapper[4626]: I0223 08:45:02.285214 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" event={"ID":"f6897aa9-e088-451a-99bb-d45572959840","Type":"ContainerDied","Data":"d4ddd2b4f049a13ba2d4b801edbf3a0f706cf5ba2678d8210047de1153a9b840"} Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.594026 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.727329 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6897aa9-e088-451a-99bb-d45572959840-config-volume\") pod \"f6897aa9-e088-451a-99bb-d45572959840\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.727466 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5j86\" (UniqueName: \"kubernetes.io/projected/f6897aa9-e088-451a-99bb-d45572959840-kube-api-access-j5j86\") pod \"f6897aa9-e088-451a-99bb-d45572959840\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.727526 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6897aa9-e088-451a-99bb-d45572959840-secret-volume\") pod \"f6897aa9-e088-451a-99bb-d45572959840\" (UID: \"f6897aa9-e088-451a-99bb-d45572959840\") " Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.728959 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6897aa9-e088-451a-99bb-d45572959840-config-volume" (OuterVolumeSpecName: "config-volume") pod "f6897aa9-e088-451a-99bb-d45572959840" (UID: "f6897aa9-e088-451a-99bb-d45572959840"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.733611 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6897aa9-e088-451a-99bb-d45572959840-kube-api-access-j5j86" (OuterVolumeSpecName: "kube-api-access-j5j86") pod "f6897aa9-e088-451a-99bb-d45572959840" (UID: "f6897aa9-e088-451a-99bb-d45572959840"). InnerVolumeSpecName "kube-api-access-j5j86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.760386 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6897aa9-e088-451a-99bb-d45572959840-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f6897aa9-e088-451a-99bb-d45572959840" (UID: "f6897aa9-e088-451a-99bb-d45572959840"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.829752 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5j86\" (UniqueName: \"kubernetes.io/projected/f6897aa9-e088-451a-99bb-d45572959840-kube-api-access-j5j86\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.829783 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f6897aa9-e088-451a-99bb-d45572959840-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:03 crc kubenswrapper[4626]: I0223 08:45:03.829794 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6897aa9-e088-451a-99bb-d45572959840-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:04 crc kubenswrapper[4626]: I0223 08:45:04.307196 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" 
event={"ID":"f6897aa9-e088-451a-99bb-d45572959840","Type":"ContainerDied","Data":"39cfca65769cda9b9476485aafdaeee38a50d9512679a81161c8053f4dbab59f"} Feb 23 08:45:04 crc kubenswrapper[4626]: I0223 08:45:04.307250 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39cfca65769cda9b9476485aafdaeee38a50d9512679a81161c8053f4dbab59f" Feb 23 08:45:04 crc kubenswrapper[4626]: I0223 08:45:04.307677 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh" Feb 23 08:45:04 crc kubenswrapper[4626]: I0223 08:45:04.417624 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"] Feb 23 08:45:04 crc kubenswrapper[4626]: I0223 08:45:04.427789 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-pf2kv"] Feb 23 08:45:05 crc kubenswrapper[4626]: I0223 08:45:05.993284 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac33a59-edb4-492c-a2fa-df4a119ade7b" path="/var/lib/kubelet/pods/1ac33a59-edb4-492c-a2fa-df4a119ade7b/volumes" Feb 23 08:45:08 crc kubenswrapper[4626]: I0223 08:45:08.982886 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:45:08 crc kubenswrapper[4626]: E0223 08:45:08.983614 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:45:19 crc kubenswrapper[4626]: I0223 08:45:19.983815 4626 scope.go:117] "RemoveContainer" 
containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:45:19 crc kubenswrapper[4626]: E0223 08:45:19.984997 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:45:24 crc kubenswrapper[4626]: I0223 08:45:24.183017 4626 scope.go:117] "RemoveContainer" containerID="50ffb83eace4b93db617b7fe5c2b16a1920fe316f2454b161c00738f95cf06f7" Feb 23 08:45:31 crc kubenswrapper[4626]: I0223 08:45:30.984106 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:45:31 crc kubenswrapper[4626]: E0223 08:45:30.984982 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:45:45 crc kubenswrapper[4626]: I0223 08:45:45.982565 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:45:45 crc kubenswrapper[4626]: E0223 08:45:45.983571 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:45:59 crc kubenswrapper[4626]: I0223 08:45:59.986639 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:46:00 crc kubenswrapper[4626]: I0223 08:46:00.880998 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"b1c990c11eb054dd1040095f44f8c1b83025586bd94666ac10a73d029fd77a3f"} Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.776282 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zp7fg"] Feb 23 08:46:16 crc kubenswrapper[4626]: E0223 08:46:16.777731 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6897aa9-e088-451a-99bb-d45572959840" containerName="collect-profiles" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.777758 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6897aa9-e088-451a-99bb-d45572959840" containerName="collect-profiles" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.778075 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6897aa9-e088-451a-99bb-d45572959840" containerName="collect-profiles" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.780946 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.802714 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zp7fg"] Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.867640 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-utilities\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.867736 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdw5l\" (UniqueName: \"kubernetes.io/projected/15736282-854d-4a08-8d96-3f8b35a241f5-kube-api-access-jdw5l\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.867971 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-catalog-content\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.969899 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdw5l\" (UniqueName: \"kubernetes.io/projected/15736282-854d-4a08-8d96-3f8b35a241f5-kube-api-access-jdw5l\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.969970 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-catalog-content\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.970071 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-utilities\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.970525 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-utilities\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.970599 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-catalog-content\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:16 crc kubenswrapper[4626]: I0223 08:46:16.988163 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdw5l\" (UniqueName: \"kubernetes.io/projected/15736282-854d-4a08-8d96-3f8b35a241f5-kube-api-access-jdw5l\") pod \"community-operators-zp7fg\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:17 crc kubenswrapper[4626]: I0223 08:46:17.102955 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:17 crc kubenswrapper[4626]: I0223 08:46:17.680049 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zp7fg"] Feb 23 08:46:18 crc kubenswrapper[4626]: I0223 08:46:18.044224 4626 generic.go:334] "Generic (PLEG): container finished" podID="15736282-854d-4a08-8d96-3f8b35a241f5" containerID="606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024" exitCode=0 Feb 23 08:46:18 crc kubenswrapper[4626]: I0223 08:46:18.044279 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zp7fg" event={"ID":"15736282-854d-4a08-8d96-3f8b35a241f5","Type":"ContainerDied","Data":"606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024"} Feb 23 08:46:18 crc kubenswrapper[4626]: I0223 08:46:18.044311 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zp7fg" event={"ID":"15736282-854d-4a08-8d96-3f8b35a241f5","Type":"ContainerStarted","Data":"1c65218682afc1e7baab1a85744a6709a1df3330abb32fdfab00a25ebd92d8ec"} Feb 23 08:46:19 crc kubenswrapper[4626]: I0223 08:46:19.054439 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zp7fg" event={"ID":"15736282-854d-4a08-8d96-3f8b35a241f5","Type":"ContainerStarted","Data":"d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65"} Feb 23 08:46:20 crc kubenswrapper[4626]: I0223 08:46:20.079187 4626 generic.go:334] "Generic (PLEG): container finished" podID="15736282-854d-4a08-8d96-3f8b35a241f5" containerID="d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65" exitCode=0 Feb 23 08:46:20 crc kubenswrapper[4626]: I0223 08:46:20.079308 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zp7fg" 
event={"ID":"15736282-854d-4a08-8d96-3f8b35a241f5","Type":"ContainerDied","Data":"d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65"} Feb 23 08:46:21 crc kubenswrapper[4626]: I0223 08:46:21.094814 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zp7fg" event={"ID":"15736282-854d-4a08-8d96-3f8b35a241f5","Type":"ContainerStarted","Data":"ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead"} Feb 23 08:46:21 crc kubenswrapper[4626]: I0223 08:46:21.119543 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zp7fg" podStartSLOduration=2.645438485 podStartE2EDuration="5.119518769s" podCreationTimestamp="2026-02-23 08:46:16 +0000 UTC" firstStartedPulling="2026-02-23 08:46:18.046855688 +0000 UTC m=+7530.386184954" lastFinishedPulling="2026-02-23 08:46:20.520935973 +0000 UTC m=+7532.860265238" observedRunningTime="2026-02-23 08:46:21.114021786 +0000 UTC m=+7533.453351052" watchObservedRunningTime="2026-02-23 08:46:21.119518769 +0000 UTC m=+7533.458848035" Feb 23 08:46:27 crc kubenswrapper[4626]: I0223 08:46:27.103697 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:27 crc kubenswrapper[4626]: I0223 08:46:27.104551 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:27 crc kubenswrapper[4626]: I0223 08:46:27.155344 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:27 crc kubenswrapper[4626]: I0223 08:46:27.210717 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:27 crc kubenswrapper[4626]: I0223 08:46:27.393195 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-zp7fg"] Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.182118 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zp7fg" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="registry-server" containerID="cri-o://ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead" gracePeriod=2 Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.740316 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.811871 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-utilities\") pod \"15736282-854d-4a08-8d96-3f8b35a241f5\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.812066 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-catalog-content\") pod \"15736282-854d-4a08-8d96-3f8b35a241f5\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.812238 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdw5l\" (UniqueName: \"kubernetes.io/projected/15736282-854d-4a08-8d96-3f8b35a241f5-kube-api-access-jdw5l\") pod \"15736282-854d-4a08-8d96-3f8b35a241f5\" (UID: \"15736282-854d-4a08-8d96-3f8b35a241f5\") " Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.812435 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-utilities" (OuterVolumeSpecName: "utilities") pod "15736282-854d-4a08-8d96-3f8b35a241f5" (UID: 
"15736282-854d-4a08-8d96-3f8b35a241f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.812835 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.822733 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15736282-854d-4a08-8d96-3f8b35a241f5-kube-api-access-jdw5l" (OuterVolumeSpecName: "kube-api-access-jdw5l") pod "15736282-854d-4a08-8d96-3f8b35a241f5" (UID: "15736282-854d-4a08-8d96-3f8b35a241f5"). InnerVolumeSpecName "kube-api-access-jdw5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.852365 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15736282-854d-4a08-8d96-3f8b35a241f5" (UID: "15736282-854d-4a08-8d96-3f8b35a241f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.914987 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdw5l\" (UniqueName: \"kubernetes.io/projected/15736282-854d-4a08-8d96-3f8b35a241f5-kube-api-access-jdw5l\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:29 crc kubenswrapper[4626]: I0223 08:46:29.915030 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15736282-854d-4a08-8d96-3f8b35a241f5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.195780 4626 generic.go:334] "Generic (PLEG): container finished" podID="15736282-854d-4a08-8d96-3f8b35a241f5" containerID="ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead" exitCode=0 Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.195852 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zp7fg" event={"ID":"15736282-854d-4a08-8d96-3f8b35a241f5","Type":"ContainerDied","Data":"ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead"} Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.195866 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zp7fg" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.195915 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zp7fg" event={"ID":"15736282-854d-4a08-8d96-3f8b35a241f5","Type":"ContainerDied","Data":"1c65218682afc1e7baab1a85744a6709a1df3330abb32fdfab00a25ebd92d8ec"} Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.195946 4626 scope.go:117] "RemoveContainer" containerID="ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.225751 4626 scope.go:117] "RemoveContainer" containerID="d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.228378 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zp7fg"] Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.249109 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zp7fg"] Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.252751 4626 scope.go:117] "RemoveContainer" containerID="606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.288773 4626 scope.go:117] "RemoveContainer" containerID="ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead" Feb 23 08:46:30 crc kubenswrapper[4626]: E0223 08:46:30.289278 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead\": container with ID starting with ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead not found: ID does not exist" containerID="ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.289332 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead"} err="failed to get container status \"ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead\": rpc error: code = NotFound desc = could not find container \"ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead\": container with ID starting with ab86cf6afa59ba70cd2fe779c8ffec364a8bbae16603dc9bd18a72e1ffe31ead not found: ID does not exist" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.289372 4626 scope.go:117] "RemoveContainer" containerID="d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65" Feb 23 08:46:30 crc kubenswrapper[4626]: E0223 08:46:30.289697 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65\": container with ID starting with d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65 not found: ID does not exist" containerID="d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.289730 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65"} err="failed to get container status \"d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65\": rpc error: code = NotFound desc = could not find container \"d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65\": container with ID starting with d86eae5d683bc787513f58156ce8834c525d773000b3ca53c1799f29c3329b65 not found: ID does not exist" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.289751 4626 scope.go:117] "RemoveContainer" containerID="606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024" Feb 23 08:46:30 crc kubenswrapper[4626]: E0223 
08:46:30.289985 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024\": container with ID starting with 606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024 not found: ID does not exist" containerID="606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024" Feb 23 08:46:30 crc kubenswrapper[4626]: I0223 08:46:30.290008 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024"} err="failed to get container status \"606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024\": rpc error: code = NotFound desc = could not find container \"606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024\": container with ID starting with 606d76ef550284609b3ea50c48395aab476d0209bb01a2d650e396685f745024 not found: ID does not exist" Feb 23 08:46:31 crc kubenswrapper[4626]: I0223 08:46:31.993450 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" path="/var/lib/kubelet/pods/15736282-854d-4a08-8d96-3f8b35a241f5/volumes" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.698235 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bgppl"] Feb 23 08:47:05 crc kubenswrapper[4626]: E0223 08:47:05.699424 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="extract-utilities" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.699443 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="extract-utilities" Feb 23 08:47:05 crc kubenswrapper[4626]: E0223 08:47:05.699469 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="registry-server" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.699476 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="registry-server" Feb 23 08:47:05 crc kubenswrapper[4626]: E0223 08:47:05.699509 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="extract-content" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.699517 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="extract-content" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.699840 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="15736282-854d-4a08-8d96-3f8b35a241f5" containerName="registry-server" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.701480 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.727677 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgppl"] Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.749580 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-449d8\" (UniqueName: \"kubernetes.io/projected/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-kube-api-access-449d8\") pod \"certified-operators-bgppl\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.749689 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-catalog-content\") pod \"certified-operators-bgppl\" (UID: 
\"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.750073 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-utilities\") pod \"certified-operators-bgppl\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.851041 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-utilities\") pod \"certified-operators-bgppl\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.851151 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-449d8\" (UniqueName: \"kubernetes.io/projected/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-kube-api-access-449d8\") pod \"certified-operators-bgppl\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.851180 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-catalog-content\") pod \"certified-operators-bgppl\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.851842 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-utilities\") pod \"certified-operators-bgppl\" (UID: 
\"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.852078 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-catalog-content\") pod \"certified-operators-bgppl\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:05 crc kubenswrapper[4626]: I0223 08:47:05.871335 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-449d8\" (UniqueName: \"kubernetes.io/projected/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-kube-api-access-449d8\") pod \"certified-operators-bgppl\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:06 crc kubenswrapper[4626]: I0223 08:47:06.020713 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:06 crc kubenswrapper[4626]: I0223 08:47:06.558434 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bgppl"] Feb 23 08:47:07 crc kubenswrapper[4626]: I0223 08:47:07.567183 4626 generic.go:334] "Generic (PLEG): container finished" podID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerID="95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc" exitCode=0 Feb 23 08:47:07 crc kubenswrapper[4626]: I0223 08:47:07.567270 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgppl" event={"ID":"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b","Type":"ContainerDied","Data":"95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc"} Feb 23 08:47:07 crc kubenswrapper[4626]: I0223 08:47:07.567587 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgppl" event={"ID":"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b","Type":"ContainerStarted","Data":"a5e046089f1c9b24d9d3c0a27c25609a364398e06b725a5b0076802f88cd9e25"} Feb 23 08:47:07 crc kubenswrapper[4626]: I0223 08:47:07.572691 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:47:08 crc kubenswrapper[4626]: I0223 08:47:08.592387 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgppl" event={"ID":"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b","Type":"ContainerStarted","Data":"34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb"} Feb 23 08:47:09 crc kubenswrapper[4626]: I0223 08:47:09.607031 4626 generic.go:334] "Generic (PLEG): container finished" podID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerID="34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb" exitCode=0 Feb 23 08:47:09 crc kubenswrapper[4626]: I0223 08:47:09.607366 4626 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-bgppl" event={"ID":"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b","Type":"ContainerDied","Data":"34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb"} Feb 23 08:47:10 crc kubenswrapper[4626]: I0223 08:47:10.622528 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgppl" event={"ID":"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b","Type":"ContainerStarted","Data":"4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066"} Feb 23 08:47:10 crc kubenswrapper[4626]: I0223 08:47:10.646233 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bgppl" podStartSLOduration=3.025234956 podStartE2EDuration="5.646196153s" podCreationTimestamp="2026-02-23 08:47:05 +0000 UTC" firstStartedPulling="2026-02-23 08:47:07.570866344 +0000 UTC m=+7579.910195609" lastFinishedPulling="2026-02-23 08:47:10.191827539 +0000 UTC m=+7582.531156806" observedRunningTime="2026-02-23 08:47:10.638508679 +0000 UTC m=+7582.977837945" watchObservedRunningTime="2026-02-23 08:47:10.646196153 +0000 UTC m=+7582.985525419" Feb 23 08:47:16 crc kubenswrapper[4626]: I0223 08:47:16.021293 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:16 crc kubenswrapper[4626]: I0223 08:47:16.022000 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:16 crc kubenswrapper[4626]: I0223 08:47:16.068728 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:16 crc kubenswrapper[4626]: I0223 08:47:16.718913 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:17 crc kubenswrapper[4626]: I0223 
08:47:17.694638 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgppl"] Feb 23 08:47:18 crc kubenswrapper[4626]: I0223 08:47:18.697571 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bgppl" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="registry-server" containerID="cri-o://4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066" gracePeriod=2 Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.218895 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.320174 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-utilities\") pod \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.320525 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-449d8\" (UniqueName: \"kubernetes.io/projected/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-kube-api-access-449d8\") pod \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.320785 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-catalog-content\") pod \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\" (UID: \"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b\") " Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.321205 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-utilities" (OuterVolumeSpecName: 
"utilities") pod "b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" (UID: "b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.321674 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.330544 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-kube-api-access-449d8" (OuterVolumeSpecName: "kube-api-access-449d8") pod "b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" (UID: "b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b"). InnerVolumeSpecName "kube-api-access-449d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.372481 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" (UID: "b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.423867 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.423898 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-449d8\" (UniqueName: \"kubernetes.io/projected/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b-kube-api-access-449d8\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.707443 4626 generic.go:334] "Generic (PLEG): container finished" podID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerID="4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066" exitCode=0 Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.707484 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgppl" event={"ID":"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b","Type":"ContainerDied","Data":"4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066"} Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.707533 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bgppl" event={"ID":"b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b","Type":"ContainerDied","Data":"a5e046089f1c9b24d9d3c0a27c25609a364398e06b725a5b0076802f88cd9e25"} Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.707555 4626 scope.go:117] "RemoveContainer" containerID="4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.707725 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bgppl" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.741653 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bgppl"] Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.741764 4626 scope.go:117] "RemoveContainer" containerID="34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.751849 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bgppl"] Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.762084 4626 scope.go:117] "RemoveContainer" containerID="95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.802521 4626 scope.go:117] "RemoveContainer" containerID="4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066" Feb 23 08:47:19 crc kubenswrapper[4626]: E0223 08:47:19.804525 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066\": container with ID starting with 4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066 not found: ID does not exist" containerID="4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.804561 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066"} err="failed to get container status \"4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066\": rpc error: code = NotFound desc = could not find container \"4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066\": container with ID starting with 4660fcc72b67253770f7b6d832299d3451f8fe307d1c8b3268b0a5a8438f3066 not 
found: ID does not exist" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.804584 4626 scope.go:117] "RemoveContainer" containerID="34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb" Feb 23 08:47:19 crc kubenswrapper[4626]: E0223 08:47:19.806107 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb\": container with ID starting with 34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb not found: ID does not exist" containerID="34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.806132 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb"} err="failed to get container status \"34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb\": rpc error: code = NotFound desc = could not find container \"34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb\": container with ID starting with 34e9ad2d1f03a102560ecfb179ab51eb77e9ccde3fa4fe8eb5da3d33c81118bb not found: ID does not exist" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.806149 4626 scope.go:117] "RemoveContainer" containerID="95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc" Feb 23 08:47:19 crc kubenswrapper[4626]: E0223 08:47:19.806373 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc\": container with ID starting with 95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc not found: ID does not exist" containerID="95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.806403 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc"} err="failed to get container status \"95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc\": rpc error: code = NotFound desc = could not find container \"95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc\": container with ID starting with 95b74765d3025924d2ba70e8e0074c4c06aaf8127e6c4166f89ffdf7104164cc not found: ID does not exist" Feb 23 08:47:19 crc kubenswrapper[4626]: I0223 08:47:19.992678 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" path="/var/lib/kubelet/pods/b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b/volumes" Feb 23 08:48:25 crc kubenswrapper[4626]: I0223 08:48:25.686211 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:48:25 crc kubenswrapper[4626]: I0223 08:48:25.687875 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:48:55 crc kubenswrapper[4626]: I0223 08:48:55.685473 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:48:55 crc kubenswrapper[4626]: I0223 08:48:55.686096 4626 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.685220 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.687007 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.687132 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.688663 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1c990c11eb054dd1040095f44f8c1b83025586bd94666ac10a73d029fd77a3f"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.688835 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" 
containerID="cri-o://b1c990c11eb054dd1040095f44f8c1b83025586bd94666ac10a73d029fd77a3f" gracePeriod=600 Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.835627 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"b1c990c11eb054dd1040095f44f8c1b83025586bd94666ac10a73d029fd77a3f"} Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.835956 4626 scope.go:117] "RemoveContainer" containerID="4b003a096b4641d3e6e89c0bf2b42efe4777c31eb47e545228faf947970ab1d3" Feb 23 08:49:25 crc kubenswrapper[4626]: I0223 08:49:25.835889 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="b1c990c11eb054dd1040095f44f8c1b83025586bd94666ac10a73d029fd77a3f" exitCode=0 Feb 23 08:49:26 crc kubenswrapper[4626]: I0223 08:49:26.846532 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"} Feb 23 08:51:25 crc kubenswrapper[4626]: I0223 08:51:25.685944 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:51:25 crc kubenswrapper[4626]: I0223 08:51:25.686741 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:51:55 crc kubenswrapper[4626]: 
I0223 08:51:55.685468 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:51:55 crc kubenswrapper[4626]: I0223 08:51:55.686142 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.308558 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t4v6z"] Feb 23 08:52:20 crc kubenswrapper[4626]: E0223 08:52:20.311488 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="registry-server" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.311532 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="registry-server" Feb 23 08:52:20 crc kubenswrapper[4626]: E0223 08:52:20.311556 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="extract-utilities" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.311564 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="extract-utilities" Feb 23 08:52:20 crc kubenswrapper[4626]: E0223 08:52:20.311599 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="extract-content" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.311605 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="extract-content" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.312197 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c872fc-e4b6-4f22-a21a-2f0bf3e7c94b" containerName="registry-server" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.315768 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.317899 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4v6z"] Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.492871 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-utilities\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.493194 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-774zx\" (UniqueName: \"kubernetes.io/projected/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-kube-api-access-774zx\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.493341 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-catalog-content\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.596078 4626 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-utilities\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.596135 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-774zx\" (UniqueName: \"kubernetes.io/projected/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-kube-api-access-774zx\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.596165 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-catalog-content\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.596907 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-catalog-content\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.596950 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-utilities\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.613296 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-774zx\" (UniqueName: 
\"kubernetes.io/projected/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-kube-api-access-774zx\") pod \"redhat-operators-t4v6z\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") " pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:20 crc kubenswrapper[4626]: I0223 08:52:20.636486 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4v6z" Feb 23 08:52:21 crc kubenswrapper[4626]: I0223 08:52:21.276098 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4v6z"] Feb 23 08:52:21 crc kubenswrapper[4626]: I0223 08:52:21.411289 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4v6z" event={"ID":"84c2c2a0-3bc4-4d9e-bb70-f191569f0873","Type":"ContainerStarted","Data":"1579f407e52fa513f7b37d899472741883bad2ff00701f00b9c6ecf9f9b05360"} Feb 23 08:52:22 crc kubenswrapper[4626]: I0223 08:52:22.419779 4626 generic.go:334] "Generic (PLEG): container finished" podID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerID="cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf" exitCode=0 Feb 23 08:52:22 crc kubenswrapper[4626]: I0223 08:52:22.419826 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4v6z" event={"ID":"84c2c2a0-3bc4-4d9e-bb70-f191569f0873","Type":"ContainerDied","Data":"cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf"} Feb 23 08:52:22 crc kubenswrapper[4626]: I0223 08:52:22.423322 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:52:23 crc kubenswrapper[4626]: I0223 08:52:23.429522 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4v6z" event={"ID":"84c2c2a0-3bc4-4d9e-bb70-f191569f0873","Type":"ContainerStarted","Data":"1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce"} Feb 23 08:52:25 crc 
kubenswrapper[4626]: I0223 08:52:25.685155 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:52:25 crc kubenswrapper[4626]: I0223 08:52:25.685354 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:52:25 crc kubenswrapper[4626]: I0223 08:52:25.685396 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 08:52:25 crc kubenswrapper[4626]: I0223 08:52:25.685978 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:52:25 crc kubenswrapper[4626]: I0223 08:52:25.686029 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" gracePeriod=600 Feb 23 08:52:26 crc kubenswrapper[4626]: E0223 08:52:26.906070 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:52:27 crc kubenswrapper[4626]: I0223 08:52:27.463412 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" exitCode=0 Feb 23 08:52:27 crc kubenswrapper[4626]: I0223 08:52:27.463467 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"} Feb 23 08:52:27 crc kubenswrapper[4626]: I0223 08:52:27.463693 4626 scope.go:117] "RemoveContainer" containerID="b1c990c11eb054dd1040095f44f8c1b83025586bd94666ac10a73d029fd77a3f" Feb 23 08:52:27 crc kubenswrapper[4626]: I0223 08:52:27.464257 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:52:27 crc kubenswrapper[4626]: E0223 08:52:27.464662 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:52:27 crc kubenswrapper[4626]: I0223 08:52:27.465867 4626 generic.go:334] "Generic (PLEG): container finished" podID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerID="1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce" exitCode=0 Feb 23 08:52:27 crc kubenswrapper[4626]: 
I0223 08:52:27.465897 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4v6z" event={"ID":"84c2c2a0-3bc4-4d9e-bb70-f191569f0873","Type":"ContainerDied","Data":"1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce"}
Feb 23 08:52:28 crc kubenswrapper[4626]: I0223 08:52:28.479308 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4v6z" event={"ID":"84c2c2a0-3bc4-4d9e-bb70-f191569f0873","Type":"ContainerStarted","Data":"0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4"}
Feb 23 08:52:28 crc kubenswrapper[4626]: I0223 08:52:28.494998 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t4v6z" podStartSLOduration=2.991968353 podStartE2EDuration="8.494979027s" podCreationTimestamp="2026-02-23 08:52:20 +0000 UTC" firstStartedPulling="2026-02-23 08:52:22.422301242 +0000 UTC m=+7894.761630508" lastFinishedPulling="2026-02-23 08:52:27.925311916 +0000 UTC m=+7900.264641182" observedRunningTime="2026-02-23 08:52:28.492671787 +0000 UTC m=+7900.832001054" watchObservedRunningTime="2026-02-23 08:52:28.494979027 +0000 UTC m=+7900.834308294"
Feb 23 08:52:30 crc kubenswrapper[4626]: I0223 08:52:30.637685 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t4v6z"
Feb 23 08:52:30 crc kubenswrapper[4626]: I0223 08:52:30.637990 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t4v6z"
Feb 23 08:52:31 crc kubenswrapper[4626]: I0223 08:52:31.671693 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t4v6z" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:52:31 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:52:31 crc kubenswrapper[4626]: >
Feb 23 08:52:41 crc kubenswrapper[4626]: I0223 08:52:41.669056 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t4v6z" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:52:41 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:52:41 crc kubenswrapper[4626]: >
Feb 23 08:52:41 crc kubenswrapper[4626]: I0223 08:52:41.982106 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:52:41 crc kubenswrapper[4626]: E0223 08:52:41.982384 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.362853 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-426nl"]
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.371281 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.375436 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-426nl"]
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.528004 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-utilities\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.528064 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wnz\" (UniqueName: \"kubernetes.io/projected/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-kube-api-access-s7wnz\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.528101 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-catalog-content\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.630047 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-catalog-content\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.630218 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-utilities\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.630256 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7wnz\" (UniqueName: \"kubernetes.io/projected/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-kube-api-access-s7wnz\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.630520 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-catalog-content\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.630767 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-utilities\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.649117 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wnz\" (UniqueName: \"kubernetes.io/projected/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-kube-api-access-s7wnz\") pod \"redhat-marketplace-426nl\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") " pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:49 crc kubenswrapper[4626]: I0223 08:52:49.694660 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:50 crc kubenswrapper[4626]: I0223 08:52:50.112112 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-426nl"]
Feb 23 08:52:50 crc kubenswrapper[4626]: I0223 08:52:50.636936 4626 generic.go:334] "Generic (PLEG): container finished" podID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerID="6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0" exitCode=0
Feb 23 08:52:50 crc kubenswrapper[4626]: I0223 08:52:50.637003 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-426nl" event={"ID":"2bfe7840-8a61-4ae2-afdf-14ccf21816a9","Type":"ContainerDied","Data":"6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0"}
Feb 23 08:52:50 crc kubenswrapper[4626]: I0223 08:52:50.637154 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-426nl" event={"ID":"2bfe7840-8a61-4ae2-afdf-14ccf21816a9","Type":"ContainerStarted","Data":"cd93f8fbc85e98fefc2cd793b3693b619d51f5191af718cbeb4bfc9772026a09"}
Feb 23 08:52:50 crc kubenswrapper[4626]: I0223 08:52:50.681699 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t4v6z"
Feb 23 08:52:50 crc kubenswrapper[4626]: I0223 08:52:50.719868 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t4v6z"
Feb 23 08:52:51 crc kubenswrapper[4626]: I0223 08:52:51.648088 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-426nl" event={"ID":"2bfe7840-8a61-4ae2-afdf-14ccf21816a9","Type":"ContainerStarted","Data":"9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1"}
Feb 23 08:52:52 crc kubenswrapper[4626]: I0223 08:52:52.536015 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4v6z"]
Feb 23 08:52:52 crc kubenswrapper[4626]: I0223 08:52:52.656141 4626 generic.go:334] "Generic (PLEG): container finished" podID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerID="9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1" exitCode=0
Feb 23 08:52:52 crc kubenswrapper[4626]: I0223 08:52:52.656236 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-426nl" event={"ID":"2bfe7840-8a61-4ae2-afdf-14ccf21816a9","Type":"ContainerDied","Data":"9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1"}
Feb 23 08:52:52 crc kubenswrapper[4626]: I0223 08:52:52.657013 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t4v6z" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="registry-server" containerID="cri-o://0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4" gracePeriod=2
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.070735 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4v6z"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.203942 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-catalog-content\") pod \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") "
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.204149 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-utilities\") pod \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") "
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.204186 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-774zx\" (UniqueName: \"kubernetes.io/projected/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-kube-api-access-774zx\") pod \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\" (UID: \"84c2c2a0-3bc4-4d9e-bb70-f191569f0873\") "
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.205314 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-utilities" (OuterVolumeSpecName: "utilities") pod "84c2c2a0-3bc4-4d9e-bb70-f191569f0873" (UID: "84c2c2a0-3bc4-4d9e-bb70-f191569f0873"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.212933 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-kube-api-access-774zx" (OuterVolumeSpecName: "kube-api-access-774zx") pod "84c2c2a0-3bc4-4d9e-bb70-f191569f0873" (UID: "84c2c2a0-3bc4-4d9e-bb70-f191569f0873"). InnerVolumeSpecName "kube-api-access-774zx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.308648 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.308679 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-774zx\" (UniqueName: \"kubernetes.io/projected/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-kube-api-access-774zx\") on node \"crc\" DevicePath \"\""
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.310191 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84c2c2a0-3bc4-4d9e-bb70-f191569f0873" (UID: "84c2c2a0-3bc4-4d9e-bb70-f191569f0873"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.410639 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c2c2a0-3bc4-4d9e-bb70-f191569f0873-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.665303 4626 generic.go:334] "Generic (PLEG): container finished" podID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerID="0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4" exitCode=0
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.665357 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4v6z" event={"ID":"84c2c2a0-3bc4-4d9e-bb70-f191569f0873","Type":"ContainerDied","Data":"0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4"}
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.665403 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4v6z" event={"ID":"84c2c2a0-3bc4-4d9e-bb70-f191569f0873","Type":"ContainerDied","Data":"1579f407e52fa513f7b37d899472741883bad2ff00701f00b9c6ecf9f9b05360"}
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.665440 4626 scope.go:117] "RemoveContainer" containerID="0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.665361 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4v6z"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.670720 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-426nl" event={"ID":"2bfe7840-8a61-4ae2-afdf-14ccf21816a9","Type":"ContainerStarted","Data":"9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589"}
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.686080 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-426nl" podStartSLOduration=2.054496153 podStartE2EDuration="4.686068574s" podCreationTimestamp="2026-02-23 08:52:49 +0000 UTC" firstStartedPulling="2026-02-23 08:52:50.638546022 +0000 UTC m=+7922.977875288" lastFinishedPulling="2026-02-23 08:52:53.270118443 +0000 UTC m=+7925.609447709" observedRunningTime="2026-02-23 08:52:53.682697078 +0000 UTC m=+7926.022026343" watchObservedRunningTime="2026-02-23 08:52:53.686068574 +0000 UTC m=+7926.025397840"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.687519 4626 scope.go:117] "RemoveContainer" containerID="1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.706672 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4v6z"]
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.715584 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t4v6z"]
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.722949 4626 scope.go:117] "RemoveContainer" containerID="cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.736780 4626 scope.go:117] "RemoveContainer" containerID="0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4"
Feb 23 08:52:53 crc kubenswrapper[4626]: E0223 08:52:53.739627 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4\": container with ID starting with 0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4 not found: ID does not exist" containerID="0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.740386 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4"} err="failed to get container status \"0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4\": rpc error: code = NotFound desc = could not find container \"0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4\": container with ID starting with 0286cfd1df5e6e2e2caf245abda3cafffd2bab03d85b862fa80c31188a7bb3a4 not found: ID does not exist"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.740432 4626 scope.go:117] "RemoveContainer" containerID="1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce"
Feb 23 08:52:53 crc kubenswrapper[4626]: E0223 08:52:53.740755 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce\": container with ID starting with 1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce not found: ID does not exist" containerID="1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.740784 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce"} err="failed to get container status \"1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce\": rpc error: code = NotFound desc = could not find container \"1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce\": container with ID starting with 1ec78a7da9bebb5280658c6d6b7fd96ad1b8d432cb687e56217a7cb34e3b70ce not found: ID does not exist"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.740798 4626 scope.go:117] "RemoveContainer" containerID="cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf"
Feb 23 08:52:53 crc kubenswrapper[4626]: E0223 08:52:53.741100 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf\": container with ID starting with cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf not found: ID does not exist" containerID="cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.741133 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf"} err="failed to get container status \"cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf\": rpc error: code = NotFound desc = could not find container \"cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf\": container with ID starting with cd8371f9830272a3de962e60e5d0fc7994301811db06e68c5db9bfbf0041fcaf not found: ID does not exist"
Feb 23 08:52:53 crc kubenswrapper[4626]: I0223 08:52:53.991852 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" path="/var/lib/kubelet/pods/84c2c2a0-3bc4-4d9e-bb70-f191569f0873/volumes"
Feb 23 08:52:56 crc kubenswrapper[4626]: I0223 08:52:56.981911 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:52:56 crc kubenswrapper[4626]: E0223 08:52:56.982331 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:52:59 crc kubenswrapper[4626]: I0223 08:52:59.695436 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:59 crc kubenswrapper[4626]: I0223 08:52:59.695831 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:59 crc kubenswrapper[4626]: I0223 08:52:59.732028 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:59 crc kubenswrapper[4626]: I0223 08:52:59.769964 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:52:59 crc kubenswrapper[4626]: I0223 08:52:59.958941 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-426nl"]
Feb 23 08:53:01 crc kubenswrapper[4626]: I0223 08:53:01.725824 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-426nl" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="registry-server" containerID="cri-o://9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589" gracePeriod=2
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.157282 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.177781 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wnz\" (UniqueName: \"kubernetes.io/projected/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-kube-api-access-s7wnz\") pod \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") "
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.177870 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-utilities\") pod \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") "
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.177894 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-catalog-content\") pod \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\" (UID: \"2bfe7840-8a61-4ae2-afdf-14ccf21816a9\") "
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.181093 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-utilities" (OuterVolumeSpecName: "utilities") pod "2bfe7840-8a61-4ae2-afdf-14ccf21816a9" (UID: "2bfe7840-8a61-4ae2-afdf-14ccf21816a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.185923 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-kube-api-access-s7wnz" (OuterVolumeSpecName: "kube-api-access-s7wnz") pod "2bfe7840-8a61-4ae2-afdf-14ccf21816a9" (UID: "2bfe7840-8a61-4ae2-afdf-14ccf21816a9"). InnerVolumeSpecName "kube-api-access-s7wnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.198874 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bfe7840-8a61-4ae2-afdf-14ccf21816a9" (UID: "2bfe7840-8a61-4ae2-afdf-14ccf21816a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.280741 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wnz\" (UniqueName: \"kubernetes.io/projected/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-kube-api-access-s7wnz\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.280774 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.280784 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfe7840-8a61-4ae2-afdf-14ccf21816a9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.734575 4626 generic.go:334] "Generic (PLEG): container finished" podID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerID="9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589" exitCode=0
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.734669 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-426nl"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.734692 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-426nl" event={"ID":"2bfe7840-8a61-4ae2-afdf-14ccf21816a9","Type":"ContainerDied","Data":"9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589"}
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.734954 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-426nl" event={"ID":"2bfe7840-8a61-4ae2-afdf-14ccf21816a9","Type":"ContainerDied","Data":"cd93f8fbc85e98fefc2cd793b3693b619d51f5191af718cbeb4bfc9772026a09"}
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.734974 4626 scope.go:117] "RemoveContainer" containerID="9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.761438 4626 scope.go:117] "RemoveContainer" containerID="9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.761772 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-426nl"]
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.772345 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-426nl"]
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.782131 4626 scope.go:117] "RemoveContainer" containerID="6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.818219 4626 scope.go:117] "RemoveContainer" containerID="9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589"
Feb 23 08:53:02 crc kubenswrapper[4626]: E0223 08:53:02.818638 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589\": container with ID starting with 9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589 not found: ID does not exist" containerID="9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.818674 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589"} err="failed to get container status \"9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589\": rpc error: code = NotFound desc = could not find container \"9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589\": container with ID starting with 9ff968e305e60bcc5bd9c68ce4c4e29496f55ae2811f133099795ca3b29a1589 not found: ID does not exist"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.818698 4626 scope.go:117] "RemoveContainer" containerID="9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1"
Feb 23 08:53:02 crc kubenswrapper[4626]: E0223 08:53:02.818983 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1\": container with ID starting with 9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1 not found: ID does not exist" containerID="9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.819006 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1"} err="failed to get container status \"9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1\": rpc error: code = NotFound desc = could not find container \"9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1\": container with ID starting with 9e641ebeec2f75ec145e23ac7f76881d54a3dd66b27bc59ceacf5b7247f52ef1 not found: ID does not exist"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.819019 4626 scope.go:117] "RemoveContainer" containerID="6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0"
Feb 23 08:53:02 crc kubenswrapper[4626]: E0223 08:53:02.819333 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0\": container with ID starting with 6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0 not found: ID does not exist" containerID="6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0"
Feb 23 08:53:02 crc kubenswrapper[4626]: I0223 08:53:02.819354 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0"} err="failed to get container status \"6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0\": rpc error: code = NotFound desc = could not find container \"6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0\": container with ID starting with 6204a56a2315c6b57af9cd58e2f76ebaf59c7792eaf55155641176d24590bba0 not found: ID does not exist"
Feb 23 08:53:03 crc kubenswrapper[4626]: I0223 08:53:03.990130 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" path="/var/lib/kubelet/pods/2bfe7840-8a61-4ae2-afdf-14ccf21816a9/volumes"
Feb 23 08:53:11 crc kubenswrapper[4626]: I0223 08:53:11.982910 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:53:11 crc kubenswrapper[4626]: E0223 08:53:11.983771 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:53:22 crc kubenswrapper[4626]: I0223 08:53:22.981759 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:53:22 crc kubenswrapper[4626]: E0223 08:53:22.982420 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:53:35 crc kubenswrapper[4626]: I0223 08:53:35.981971 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:53:35 crc kubenswrapper[4626]: E0223 08:53:35.982918 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:53:50 crc kubenswrapper[4626]: I0223 08:53:50.982329 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:53:50 crc kubenswrapper[4626]: E0223 08:53:50.983440 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:54:03 crc kubenswrapper[4626]: I0223 08:54:03.982188 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:54:03 crc kubenswrapper[4626]: E0223 08:54:03.983246 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:54:17 crc kubenswrapper[4626]: I0223 08:54:17.981790 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:54:17 crc kubenswrapper[4626]: E0223 08:54:17.982688 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:54:32 crc kubenswrapper[4626]: I0223 08:54:32.983176 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:54:32 crc kubenswrapper[4626]: E0223 08:54:32.983799 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:54:44 crc kubenswrapper[4626]: I0223 08:54:44.981939 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:54:44 crc kubenswrapper[4626]: E0223 08:54:44.982676 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:54:58 crc kubenswrapper[4626]: I0223 08:54:58.982569 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:54:58 crc kubenswrapper[4626]: E0223 08:54:58.983427 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:55:09 crc kubenswrapper[4626]: I0223 08:55:09.982844 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:55:09 crc kubenswrapper[4626]: E0223 08:55:09.983830 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:55:24 crc kubenswrapper[4626]: I0223 08:55:24.984379 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:55:24 crc kubenswrapper[4626]: E0223 08:55:24.985014 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:55:37 crc kubenswrapper[4626]: I0223 08:55:37.989017 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 08:55:37 crc kubenswrapper[4626]: E0223 08:55:37.989938 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 08:55:51 crc kubenswrapper[4626]: I0223 
08:55:51.982102 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:55:51 crc kubenswrapper[4626]: E0223 08:55:51.983161 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:56:06 crc kubenswrapper[4626]: I0223 08:56:06.981878 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:56:06 crc kubenswrapper[4626]: E0223 08:56:06.982632 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:56:18 crc kubenswrapper[4626]: I0223 08:56:18.983116 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:56:18 crc kubenswrapper[4626]: E0223 08:56:18.983828 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:56:33 crc 
kubenswrapper[4626]: I0223 08:56:33.982068 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:56:33 crc kubenswrapper[4626]: E0223 08:56:33.982986 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:56:46 crc kubenswrapper[4626]: I0223 08:56:46.982194 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:56:46 crc kubenswrapper[4626]: E0223 08:56:46.983002 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:56:59 crc kubenswrapper[4626]: I0223 08:56:59.983651 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:56:59 crc kubenswrapper[4626]: E0223 08:56:59.985061 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 
23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.379244 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l54mp"] Feb 23 08:57:10 crc kubenswrapper[4626]: E0223 08:57:10.380544 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="extract-utilities" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.380565 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="extract-utilities" Feb 23 08:57:10 crc kubenswrapper[4626]: E0223 08:57:10.380578 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="extract-content" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.380586 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="extract-content" Feb 23 08:57:10 crc kubenswrapper[4626]: E0223 08:57:10.380601 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="extract-content" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.380609 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="extract-content" Feb 23 08:57:10 crc kubenswrapper[4626]: E0223 08:57:10.380623 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="registry-server" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.380629 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="registry-server" Feb 23 08:57:10 crc kubenswrapper[4626]: E0223 08:57:10.380658 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="extract-utilities" Feb 23 08:57:10 crc 
kubenswrapper[4626]: I0223 08:57:10.380666 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="extract-utilities" Feb 23 08:57:10 crc kubenswrapper[4626]: E0223 08:57:10.380688 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="registry-server" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.380694 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="registry-server" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.381311 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c2c2a0-3bc4-4d9e-bb70-f191569f0873" containerName="registry-server" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.381357 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bfe7840-8a61-4ae2-afdf-14ccf21816a9" containerName="registry-server" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.383472 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.389607 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l54mp"] Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.482424 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw4kb\" (UniqueName: \"kubernetes.io/projected/2bc9ea76-5919-4729-a931-70b179b916cb-kube-api-access-tw4kb\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.482748 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-utilities\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.482874 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-catalog-content\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.585719 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-utilities\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.585812 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-catalog-content\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.586068 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw4kb\" (UniqueName: \"kubernetes.io/projected/2bc9ea76-5919-4729-a931-70b179b916cb-kube-api-access-tw4kb\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.587049 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-catalog-content\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.587186 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-utilities\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.614704 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw4kb\" (UniqueName: \"kubernetes.io/projected/2bc9ea76-5919-4729-a931-70b179b916cb-kube-api-access-tw4kb\") pod \"community-operators-l54mp\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.722300 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:10 crc kubenswrapper[4626]: I0223 08:57:10.982614 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:57:10 crc kubenswrapper[4626]: E0223 08:57:10.983226 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:57:11 crc kubenswrapper[4626]: I0223 08:57:11.209229 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l54mp"] Feb 23 08:57:12 crc kubenswrapper[4626]: I0223 08:57:12.004556 4626 generic.go:334] "Generic (PLEG): container finished" podID="2bc9ea76-5919-4729-a931-70b179b916cb" containerID="7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1" exitCode=0 Feb 23 08:57:12 crc kubenswrapper[4626]: I0223 08:57:12.004915 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l54mp" event={"ID":"2bc9ea76-5919-4729-a931-70b179b916cb","Type":"ContainerDied","Data":"7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1"} Feb 23 08:57:12 crc kubenswrapper[4626]: I0223 08:57:12.004947 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l54mp" event={"ID":"2bc9ea76-5919-4729-a931-70b179b916cb","Type":"ContainerStarted","Data":"ff2abba1724389d5a238d84c7d5fff17c5ca47270ef2c1c2490cc2164b03b07e"} Feb 23 08:57:13 crc kubenswrapper[4626]: I0223 08:57:13.016878 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l54mp" 
event={"ID":"2bc9ea76-5919-4729-a931-70b179b916cb","Type":"ContainerStarted","Data":"c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e"} Feb 23 08:57:14 crc kubenswrapper[4626]: I0223 08:57:14.031730 4626 generic.go:334] "Generic (PLEG): container finished" podID="2bc9ea76-5919-4729-a931-70b179b916cb" containerID="c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e" exitCode=0 Feb 23 08:57:14 crc kubenswrapper[4626]: I0223 08:57:14.032036 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l54mp" event={"ID":"2bc9ea76-5919-4729-a931-70b179b916cb","Type":"ContainerDied","Data":"c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e"} Feb 23 08:57:15 crc kubenswrapper[4626]: I0223 08:57:15.043275 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l54mp" event={"ID":"2bc9ea76-5919-4729-a931-70b179b916cb","Type":"ContainerStarted","Data":"c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f"} Feb 23 08:57:15 crc kubenswrapper[4626]: I0223 08:57:15.064533 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l54mp" podStartSLOduration=2.571839165 podStartE2EDuration="5.064512176s" podCreationTimestamp="2026-02-23 08:57:10 +0000 UTC" firstStartedPulling="2026-02-23 08:57:12.006693648 +0000 UTC m=+8184.346022914" lastFinishedPulling="2026-02-23 08:57:14.499366659 +0000 UTC m=+8186.838695925" observedRunningTime="2026-02-23 08:57:15.059177979 +0000 UTC m=+8187.398507245" watchObservedRunningTime="2026-02-23 08:57:15.064512176 +0000 UTC m=+8187.403841442" Feb 23 08:57:20 crc kubenswrapper[4626]: I0223 08:57:20.722350 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:20 crc kubenswrapper[4626]: I0223 08:57:20.722886 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:20 crc kubenswrapper[4626]: I0223 08:57:20.767881 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:21 crc kubenswrapper[4626]: I0223 08:57:21.143609 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:21 crc kubenswrapper[4626]: I0223 08:57:21.201101 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l54mp"] Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.124661 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l54mp" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="registry-server" containerID="cri-o://c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f" gracePeriod=2 Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.633878 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.703305 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw4kb\" (UniqueName: \"kubernetes.io/projected/2bc9ea76-5919-4729-a931-70b179b916cb-kube-api-access-tw4kb\") pod \"2bc9ea76-5919-4729-a931-70b179b916cb\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.703471 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-utilities\") pod \"2bc9ea76-5919-4729-a931-70b179b916cb\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.703799 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-catalog-content\") pod \"2bc9ea76-5919-4729-a931-70b179b916cb\" (UID: \"2bc9ea76-5919-4729-a931-70b179b916cb\") " Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.704961 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-utilities" (OuterVolumeSpecName: "utilities") pod "2bc9ea76-5919-4729-a931-70b179b916cb" (UID: "2bc9ea76-5919-4729-a931-70b179b916cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.706141 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.725256 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc9ea76-5919-4729-a931-70b179b916cb-kube-api-access-tw4kb" (OuterVolumeSpecName: "kube-api-access-tw4kb") pod "2bc9ea76-5919-4729-a931-70b179b916cb" (UID: "2bc9ea76-5919-4729-a931-70b179b916cb"). InnerVolumeSpecName "kube-api-access-tw4kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.754963 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bc9ea76-5919-4729-a931-70b179b916cb" (UID: "2bc9ea76-5919-4729-a931-70b179b916cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.809167 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc9ea76-5919-4729-a931-70b179b916cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:23 crc kubenswrapper[4626]: I0223 08:57:23.809201 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw4kb\" (UniqueName: \"kubernetes.io/projected/2bc9ea76-5919-4729-a931-70b179b916cb-kube-api-access-tw4kb\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.142883 4626 generic.go:334] "Generic (PLEG): container finished" podID="2bc9ea76-5919-4729-a931-70b179b916cb" containerID="c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f" exitCode=0 Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.144214 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l54mp" event={"ID":"2bc9ea76-5919-4729-a931-70b179b916cb","Type":"ContainerDied","Data":"c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f"} Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.144384 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l54mp" event={"ID":"2bc9ea76-5919-4729-a931-70b179b916cb","Type":"ContainerDied","Data":"ff2abba1724389d5a238d84c7d5fff17c5ca47270ef2c1c2490cc2164b03b07e"} Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.144458 4626 scope.go:117] "RemoveContainer" containerID="c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.144754 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l54mp" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.178975 4626 scope.go:117] "RemoveContainer" containerID="c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.193825 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l54mp"] Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.200337 4626 scope.go:117] "RemoveContainer" containerID="7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.206643 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l54mp"] Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.236921 4626 scope.go:117] "RemoveContainer" containerID="c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f" Feb 23 08:57:24 crc kubenswrapper[4626]: E0223 08:57:24.237307 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f\": container with ID starting with c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f not found: ID does not exist" containerID="c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.237354 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f"} err="failed to get container status \"c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f\": rpc error: code = NotFound desc = could not find container \"c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f\": container with ID starting with c0c5483100f339b605b757fde6a3fa2cb8cd9bd11f79443f2363eac42938cf2f not 
found: ID does not exist" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.237390 4626 scope.go:117] "RemoveContainer" containerID="c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e" Feb 23 08:57:24 crc kubenswrapper[4626]: E0223 08:57:24.237682 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e\": container with ID starting with c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e not found: ID does not exist" containerID="c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.237711 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e"} err="failed to get container status \"c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e\": rpc error: code = NotFound desc = could not find container \"c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e\": container with ID starting with c2655104dfd02d4558d20262ee4e34bea009947f8cdf4f1c32260a4425e8ac7e not found: ID does not exist" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.237730 4626 scope.go:117] "RemoveContainer" containerID="7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1" Feb 23 08:57:24 crc kubenswrapper[4626]: E0223 08:57:24.237987 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1\": container with ID starting with 7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1 not found: ID does not exist" containerID="7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.238011 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1"} err="failed to get container status \"7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1\": rpc error: code = NotFound desc = could not find container \"7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1\": container with ID starting with 7dc8effa3c78351b53878216ffc60550ca7831c5dc1f736f0a650f6218969ba1 not found: ID does not exist" Feb 23 08:57:24 crc kubenswrapper[4626]: I0223 08:57:24.982912 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:57:24 crc kubenswrapper[4626]: E0223 08:57:24.983523 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 08:57:25 crc kubenswrapper[4626]: I0223 08:57:25.992401 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" path="/var/lib/kubelet/pods/2bc9ea76-5919-4729-a931-70b179b916cb/volumes" Feb 23 08:57:38 crc kubenswrapper[4626]: I0223 08:57:38.982335 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.279724 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"ef0ef59bd5d386dff1c12a966d729743ec224ce7c0108b3374bbed5f762a9739"} Feb 23 08:57:39 crc 
kubenswrapper[4626]: I0223 08:57:39.454408 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-647d8d576f-jnrsm"] Feb 23 08:57:39 crc kubenswrapper[4626]: E0223 08:57:39.454840 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="extract-utilities" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.454895 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="extract-utilities" Feb 23 08:57:39 crc kubenswrapper[4626]: E0223 08:57:39.454920 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="extract-content" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.454927 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="extract-content" Feb 23 08:57:39 crc kubenswrapper[4626]: E0223 08:57:39.454948 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="registry-server" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.454954 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="registry-server" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.455136 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc9ea76-5919-4729-a931-70b179b916cb" containerName="registry-server" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.456048 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.501368 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647d8d576f-jnrsm"] Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.548197 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-httpd-config\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.548245 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-public-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.548279 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-combined-ca-bundle\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.548328 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-internal-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.548349 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-ovndb-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.548441 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-config\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.548475 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmd92\" (UniqueName: \"kubernetes.io/projected/f3b2c5c3-7e78-46b9-8365-396752a27b88-kube-api-access-wmd92\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.650631 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-config\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.651318 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmd92\" (UniqueName: \"kubernetes.io/projected/f3b2c5c3-7e78-46b9-8365-396752a27b88-kube-api-access-wmd92\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.651517 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-httpd-config\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.651636 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-public-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.651758 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-combined-ca-bundle\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.651844 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-internal-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.651927 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-ovndb-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.662849 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-combined-ca-bundle\") pod 
\"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.662945 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-httpd-config\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.663488 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-config\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.670789 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-ovndb-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.674435 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-public-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.677174 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmd92\" (UniqueName: \"kubernetes.io/projected/f3b2c5c3-7e78-46b9-8365-396752a27b88-kube-api-access-wmd92\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc 
kubenswrapper[4626]: I0223 08:57:39.681161 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3b2c5c3-7e78-46b9-8365-396752a27b88-internal-tls-certs\") pod \"neutron-647d8d576f-jnrsm\" (UID: \"f3b2c5c3-7e78-46b9-8365-396752a27b88\") " pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:39 crc kubenswrapper[4626]: I0223 08:57:39.774804 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:40 crc kubenswrapper[4626]: I0223 08:57:40.792937 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647d8d576f-jnrsm"] Feb 23 08:57:41 crc kubenswrapper[4626]: I0223 08:57:41.301884 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647d8d576f-jnrsm" event={"ID":"f3b2c5c3-7e78-46b9-8365-396752a27b88","Type":"ContainerStarted","Data":"8e7514809ee404c06ed790a7a8e2345dfef58e06dec33463bfc0c561bb913fa8"} Feb 23 08:57:41 crc kubenswrapper[4626]: I0223 08:57:41.302184 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647d8d576f-jnrsm" event={"ID":"f3b2c5c3-7e78-46b9-8365-396752a27b88","Type":"ContainerStarted","Data":"57ac709eefe8b89273d8fffbf231e023535903beb116f939029aba65c1ce291d"} Feb 23 08:57:41 crc kubenswrapper[4626]: I0223 08:57:41.302196 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647d8d576f-jnrsm" event={"ID":"f3b2c5c3-7e78-46b9-8365-396752a27b88","Type":"ContainerStarted","Data":"4325846415e54b026de1f501297815271e63d39757bcbf44cae2e7c0f95f11ec"} Feb 23 08:57:41 crc kubenswrapper[4626]: I0223 08:57:41.302211 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:57:41 crc kubenswrapper[4626]: I0223 08:57:41.325808 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-647d8d576f-jnrsm" 
podStartSLOduration=2.325790589 podStartE2EDuration="2.325790589s" podCreationTimestamp="2026-02-23 08:57:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:57:41.317438613 +0000 UTC m=+8213.656767878" watchObservedRunningTime="2026-02-23 08:57:41.325790589 +0000 UTC m=+8213.665119856" Feb 23 08:58:09 crc kubenswrapper[4626]: I0223 08:58:09.790618 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-647d8d576f-jnrsm" Feb 23 08:58:09 crc kubenswrapper[4626]: I0223 08:58:09.870889 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79dfdd6449-flhwk"] Feb 23 08:58:09 crc kubenswrapper[4626]: I0223 08:58:09.871118 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79dfdd6449-flhwk" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-api" containerID="cri-o://32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce" gracePeriod=30 Feb 23 08:58:09 crc kubenswrapper[4626]: I0223 08:58:09.871574 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79dfdd6449-flhwk" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-httpd" containerID="cri-o://c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b" gracePeriod=30 Feb 23 08:58:10 crc kubenswrapper[4626]: I0223 08:58:10.582611 4626 generic.go:334] "Generic (PLEG): container finished" podID="058dce91-7662-43cb-bcda-d7d44e59029b" containerID="c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b" exitCode=0 Feb 23 08:58:10 crc kubenswrapper[4626]: I0223 08:58:10.582707 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79dfdd6449-flhwk" 
event={"ID":"058dce91-7662-43cb-bcda-d7d44e59029b","Type":"ContainerDied","Data":"c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b"} Feb 23 08:58:12 crc kubenswrapper[4626]: I0223 08:58:12.213733 4626 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-79dfdd6449-flhwk" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.16:9696/\": dial tcp 10.217.1.16:9696: connect: connection refused" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.313763 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.481061 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-combined-ca-bundle\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.481291 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-internal-tls-certs\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.481639 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-ovndb-tls-certs\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.481797 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.482223 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2fn7\" (UniqueName: \"kubernetes.io/projected/058dce91-7662-43cb-bcda-d7d44e59029b-kube-api-access-d2fn7\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.482933 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-config\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.483108 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-httpd-config\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.493044 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.493068 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058dce91-7662-43cb-bcda-d7d44e59029b-kube-api-access-d2fn7" (OuterVolumeSpecName: "kube-api-access-d2fn7") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "kube-api-access-d2fn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.565431 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-config" (OuterVolumeSpecName: "config") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.568907 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.586866 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.588405 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs\") pod \"058dce91-7662-43cb-bcda-d7d44e59029b\" (UID: \"058dce91-7662-43cb-bcda-d7d44e59029b\") " Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.589561 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.589819 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2fn7\" (UniqueName: \"kubernetes.io/projected/058dce91-7662-43cb-bcda-d7d44e59029b-kube-api-access-d2fn7\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.589841 4626 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.589852 4626 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.589863 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.589872 4626 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:15 crc kubenswrapper[4626]: W0223 08:58:15.590128 4626 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/058dce91-7662-43cb-bcda-d7d44e59029b/volumes/kubernetes.io~secret/public-tls-certs Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.590487 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.602838 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "058dce91-7662-43cb-bcda-d7d44e59029b" (UID: "058dce91-7662-43cb-bcda-d7d44e59029b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.630597 4626 generic.go:334] "Generic (PLEG): container finished" podID="058dce91-7662-43cb-bcda-d7d44e59029b" containerID="32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce" exitCode=0 Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.630753 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79dfdd6449-flhwk" event={"ID":"058dce91-7662-43cb-bcda-d7d44e59029b","Type":"ContainerDied","Data":"32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce"} Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.630865 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79dfdd6449-flhwk" event={"ID":"058dce91-7662-43cb-bcda-d7d44e59029b","Type":"ContainerDied","Data":"be08365417306c286e226e76de12635afc2e8f8a43c258620915a2e942f6b25e"} Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.630762 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79dfdd6449-flhwk" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.630933 4626 scope.go:117] "RemoveContainer" containerID="c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.666878 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79dfdd6449-flhwk"] Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.671428 4626 scope.go:117] "RemoveContainer" containerID="32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.674039 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79dfdd6449-flhwk"] Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.698756 4626 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.701471 4626 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/058dce91-7662-43cb-bcda-d7d44e59029b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.711578 4626 scope.go:117] "RemoveContainer" containerID="c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b" Feb 23 08:58:15 crc kubenswrapper[4626]: E0223 08:58:15.713427 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b\": container with ID starting with c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b not found: ID does not exist" containerID="c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.713589 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b"} err="failed to get container status \"c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b\": rpc error: code = NotFound desc = could not find container \"c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b\": container with ID starting with c6a5e6555020603a620f454ea38ded98b3d1434ceb26edbe5666225e5742059b not found: ID does not exist" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.713695 4626 scope.go:117] "RemoveContainer" containerID="32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce" Feb 23 08:58:15 crc kubenswrapper[4626]: E0223 08:58:15.714242 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce\": container with ID starting with 32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce not found: ID does not exist" containerID="32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.714350 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce"} err="failed to get container status \"32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce\": rpc error: code = NotFound desc = could not find container \"32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce\": container with ID starting with 32c90a727cbb281f2befaea8ed051a18a36330bb50c6f3eeb5d8c552fff89cce not found: ID does not exist" Feb 23 08:58:15 crc kubenswrapper[4626]: I0223 08:58:15.993145 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" 
path="/var/lib/kubelet/pods/058dce91-7662-43cb-bcda-d7d44e59029b/volumes" Feb 23 08:59:34 crc kubenswrapper[4626]: E0223 08:59:34.404581 4626 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.58:60228->192.168.26.58:46805: read tcp 192.168.26.58:60228->192.168.26.58:46805: read: connection reset by peer Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.836066 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j9pst"] Feb 23 08:59:50 crc kubenswrapper[4626]: E0223 08:59:50.837466 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-api" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.837490 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-api" Feb 23 08:59:50 crc kubenswrapper[4626]: E0223 08:59:50.837532 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-httpd" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.837538 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-httpd" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.837778 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-httpd" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.837793 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="058dce91-7662-43cb-bcda-d7d44e59029b" containerName="neutron-api" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.839590 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.863347 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9pst"] Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.935355 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-utilities\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.935555 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6qz\" (UniqueName: \"kubernetes.io/projected/b9dc8ed6-ae2b-4159-9019-b828f9440a09-kube-api-access-sq6qz\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:50 crc kubenswrapper[4626]: I0223 08:59:50.935853 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-catalog-content\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.038364 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6qz\" (UniqueName: \"kubernetes.io/projected/b9dc8ed6-ae2b-4159-9019-b828f9440a09-kube-api-access-sq6qz\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.038982 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-catalog-content\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.039530 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-utilities\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.040432 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-utilities\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.041257 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-catalog-content\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.063445 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6qz\" (UniqueName: \"kubernetes.io/projected/b9dc8ed6-ae2b-4159-9019-b828f9440a09-kube-api-access-sq6qz\") pod \"certified-operators-j9pst\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") " pod="openshift-marketplace/certified-operators-j9pst" Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.160114 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9pst"
Feb 23 08:59:51 crc kubenswrapper[4626]: I0223 08:59:51.593951 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9pst"]
Feb 23 08:59:52 crc kubenswrapper[4626]: I0223 08:59:52.529079 4626 generic.go:334] "Generic (PLEG): container finished" podID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerID="4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427" exitCode=0
Feb 23 08:59:52 crc kubenswrapper[4626]: I0223 08:59:52.529210 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9pst" event={"ID":"b9dc8ed6-ae2b-4159-9019-b828f9440a09","Type":"ContainerDied","Data":"4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427"}
Feb 23 08:59:52 crc kubenswrapper[4626]: I0223 08:59:52.529523 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9pst" event={"ID":"b9dc8ed6-ae2b-4159-9019-b828f9440a09","Type":"ContainerStarted","Data":"3a6e7ea31ec7c1091296ced06da3e02404200759948759f985b2e2b28b7d4ce6"}
Feb 23 08:59:52 crc kubenswrapper[4626]: I0223 08:59:52.534707 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:59:53 crc kubenswrapper[4626]: I0223 08:59:53.540704 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9pst" event={"ID":"b9dc8ed6-ae2b-4159-9019-b828f9440a09","Type":"ContainerStarted","Data":"51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13"}
Feb 23 08:59:55 crc kubenswrapper[4626]: I0223 08:59:55.581912 4626 generic.go:334] "Generic (PLEG): container finished" podID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerID="51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13" exitCode=0
Feb 23 08:59:55 crc kubenswrapper[4626]: I0223 08:59:55.582464 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9pst" event={"ID":"b9dc8ed6-ae2b-4159-9019-b828f9440a09","Type":"ContainerDied","Data":"51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13"}
Feb 23 08:59:55 crc kubenswrapper[4626]: I0223 08:59:55.685315 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:59:55 crc kubenswrapper[4626]: I0223 08:59:55.687042 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:59:56 crc kubenswrapper[4626]: I0223 08:59:56.594164 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9pst" event={"ID":"b9dc8ed6-ae2b-4159-9019-b828f9440a09","Type":"ContainerStarted","Data":"d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97"}
Feb 23 08:59:56 crc kubenswrapper[4626]: I0223 08:59:56.617691 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j9pst" podStartSLOduration=3.104354436 podStartE2EDuration="6.617669102s" podCreationTimestamp="2026-02-23 08:59:50 +0000 UTC" firstStartedPulling="2026-02-23 08:59:52.533472945 +0000 UTC m=+8344.872802211" lastFinishedPulling="2026-02-23 08:59:56.046787621 +0000 UTC m=+8348.386116877" observedRunningTime="2026-02-23 08:59:56.617575716 +0000 UTC m=+8348.956904982" watchObservedRunningTime="2026-02-23 08:59:56.617669102 +0000 UTC m=+8348.956998368"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.182000 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"]
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.184557 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.197184 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzfrv\" (UniqueName: \"kubernetes.io/projected/2d6552c5-b9c7-4908-a010-1f965b4f278b-kube-api-access-kzfrv\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.197259 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d6552c5-b9c7-4908-a010-1f965b4f278b-config-volume\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.197286 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d6552c5-b9c7-4908-a010-1f965b4f278b-secret-volume\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.198406 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.199100 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.205818 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"]
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.299038 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzfrv\" (UniqueName: \"kubernetes.io/projected/2d6552c5-b9c7-4908-a010-1f965b4f278b-kube-api-access-kzfrv\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.299103 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d6552c5-b9c7-4908-a010-1f965b4f278b-config-volume\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.299126 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d6552c5-b9c7-4908-a010-1f965b4f278b-secret-volume\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.300056 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d6552c5-b9c7-4908-a010-1f965b4f278b-config-volume\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.308991 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d6552c5-b9c7-4908-a010-1f965b4f278b-secret-volume\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.315650 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzfrv\" (UniqueName: \"kubernetes.io/projected/2d6552c5-b9c7-4908-a010-1f965b4f278b-kube-api-access-kzfrv\") pod \"collect-profiles-29530620-24fr7\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:00 crc kubenswrapper[4626]: I0223 09:00:00.522532 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:01 crc kubenswrapper[4626]: W0223 09:00:01.030378 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d6552c5_b9c7_4908_a010_1f965b4f278b.slice/crio-ecaa4229539d17d3742a121acd7b86867dcf5fa394cafdea420d19ea5a59735b WatchSource:0}: Error finding container ecaa4229539d17d3742a121acd7b86867dcf5fa394cafdea420d19ea5a59735b: Status 404 returned error can't find the container with id ecaa4229539d17d3742a121acd7b86867dcf5fa394cafdea420d19ea5a59735b
Feb 23 09:00:01 crc kubenswrapper[4626]: I0223 09:00:01.039369 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"]
Feb 23 09:00:01 crc kubenswrapper[4626]: I0223 09:00:01.160838 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j9pst"
Feb 23 09:00:01 crc kubenswrapper[4626]: I0223 09:00:01.161102 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j9pst"
Feb 23 09:00:01 crc kubenswrapper[4626]: I0223 09:00:01.641581 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7" event={"ID":"2d6552c5-b9c7-4908-a010-1f965b4f278b","Type":"ContainerStarted","Data":"5fcc384ba187c3b389ff16486bee3d9bd438d3e136eb140c7e52e22d0d865574"}
Feb 23 09:00:01 crc kubenswrapper[4626]: I0223 09:00:01.641640 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7" event={"ID":"2d6552c5-b9c7-4908-a010-1f965b4f278b","Type":"ContainerStarted","Data":"ecaa4229539d17d3742a121acd7b86867dcf5fa394cafdea420d19ea5a59735b"}
Feb 23 09:00:01 crc kubenswrapper[4626]: I0223 09:00:01.668301 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7" podStartSLOduration=1.66828486 podStartE2EDuration="1.66828486s" podCreationTimestamp="2026-02-23 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:00:01.660634335 +0000 UTC m=+8353.999963602" watchObservedRunningTime="2026-02-23 09:00:01.66828486 +0000 UTC m=+8354.007614127"
Feb 23 09:00:02 crc kubenswrapper[4626]: I0223 09:00:02.210792 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-j9pst" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="registry-server" probeResult="failure" output=<
Feb 23 09:00:02 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 09:00:02 crc kubenswrapper[4626]: >
Feb 23 09:00:02 crc kubenswrapper[4626]: I0223 09:00:02.654034 4626 generic.go:334] "Generic (PLEG): container finished" podID="2d6552c5-b9c7-4908-a010-1f965b4f278b" containerID="5fcc384ba187c3b389ff16486bee3d9bd438d3e136eb140c7e52e22d0d865574" exitCode=0
Feb 23 09:00:02 crc kubenswrapper[4626]: I0223 09:00:02.654092 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7" event={"ID":"2d6552c5-b9c7-4908-a010-1f965b4f278b","Type":"ContainerDied","Data":"5fcc384ba187c3b389ff16486bee3d9bd438d3e136eb140c7e52e22d0d865574"}
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.034553 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.206529 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzfrv\" (UniqueName: \"kubernetes.io/projected/2d6552c5-b9c7-4908-a010-1f965b4f278b-kube-api-access-kzfrv\") pod \"2d6552c5-b9c7-4908-a010-1f965b4f278b\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") "
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.206586 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d6552c5-b9c7-4908-a010-1f965b4f278b-config-volume\") pod \"2d6552c5-b9c7-4908-a010-1f965b4f278b\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") "
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.206822 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d6552c5-b9c7-4908-a010-1f965b4f278b-secret-volume\") pod \"2d6552c5-b9c7-4908-a010-1f965b4f278b\" (UID: \"2d6552c5-b9c7-4908-a010-1f965b4f278b\") "
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.207464 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6552c5-b9c7-4908-a010-1f965b4f278b-config-volume" (OuterVolumeSpecName: "config-volume") pod "2d6552c5-b9c7-4908-a010-1f965b4f278b" (UID: "2d6552c5-b9c7-4908-a010-1f965b4f278b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.208056 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d6552c5-b9c7-4908-a010-1f965b4f278b-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.214034 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6552c5-b9c7-4908-a010-1f965b4f278b-kube-api-access-kzfrv" (OuterVolumeSpecName: "kube-api-access-kzfrv") pod "2d6552c5-b9c7-4908-a010-1f965b4f278b" (UID: "2d6552c5-b9c7-4908-a010-1f965b4f278b"). InnerVolumeSpecName "kube-api-access-kzfrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.214422 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6552c5-b9c7-4908-a010-1f965b4f278b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2d6552c5-b9c7-4908-a010-1f965b4f278b" (UID: "2d6552c5-b9c7-4908-a010-1f965b4f278b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.310149 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d6552c5-b9c7-4908-a010-1f965b4f278b-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.310188 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzfrv\" (UniqueName: \"kubernetes.io/projected/2d6552c5-b9c7-4908-a010-1f965b4f278b-kube-api-access-kzfrv\") on node \"crc\" DevicePath \"\""
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.694158 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7" event={"ID":"2d6552c5-b9c7-4908-a010-1f965b4f278b","Type":"ContainerDied","Data":"ecaa4229539d17d3742a121acd7b86867dcf5fa394cafdea420d19ea5a59735b"}
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.694214 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecaa4229539d17d3742a121acd7b86867dcf5fa394cafdea420d19ea5a59735b"
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.694764 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-24fr7"
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.754540 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj"]
Feb 23 09:00:04 crc kubenswrapper[4626]: I0223 09:00:04.764118 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-8hxfj"]
Feb 23 09:00:05 crc kubenswrapper[4626]: I0223 09:00:05.992463 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c354d1-c9bb-4808-b7ea-a63b849e5c77" path="/var/lib/kubelet/pods/55c354d1-c9bb-4808-b7ea-a63b849e5c77/volumes"
Feb 23 09:00:11 crc kubenswrapper[4626]: I0223 09:00:11.204577 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j9pst"
Feb 23 09:00:11 crc kubenswrapper[4626]: I0223 09:00:11.258385 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j9pst"
Feb 23 09:00:11 crc kubenswrapper[4626]: I0223 09:00:11.439357 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9pst"]
Feb 23 09:00:12 crc kubenswrapper[4626]: I0223 09:00:12.763981 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j9pst" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="registry-server" containerID="cri-o://d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97" gracePeriod=2
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.508992 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9pst"
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.527139 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-catalog-content\") pod \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") "
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.527255 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq6qz\" (UniqueName: \"kubernetes.io/projected/b9dc8ed6-ae2b-4159-9019-b828f9440a09-kube-api-access-sq6qz\") pod \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") "
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.527402 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-utilities\") pod \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\" (UID: \"b9dc8ed6-ae2b-4159-9019-b828f9440a09\") "
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.527879 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-utilities" (OuterVolumeSpecName: "utilities") pod "b9dc8ed6-ae2b-4159-9019-b828f9440a09" (UID: "b9dc8ed6-ae2b-4159-9019-b828f9440a09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.528632 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.538350 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9dc8ed6-ae2b-4159-9019-b828f9440a09-kube-api-access-sq6qz" (OuterVolumeSpecName: "kube-api-access-sq6qz") pod "b9dc8ed6-ae2b-4159-9019-b828f9440a09" (UID: "b9dc8ed6-ae2b-4159-9019-b828f9440a09"). InnerVolumeSpecName "kube-api-access-sq6qz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.595989 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9dc8ed6-ae2b-4159-9019-b828f9440a09" (UID: "b9dc8ed6-ae2b-4159-9019-b828f9440a09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.631710 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed6-ae2b-4159-9019-b828f9440a09-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.631748 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq6qz\" (UniqueName: \"kubernetes.io/projected/b9dc8ed6-ae2b-4159-9019-b828f9440a09-kube-api-access-sq6qz\") on node \"crc\" DevicePath \"\""
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.864302 4626 generic.go:334] "Generic (PLEG): container finished" podID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerID="d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97" exitCode=0
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.864388 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9pst" event={"ID":"b9dc8ed6-ae2b-4159-9019-b828f9440a09","Type":"ContainerDied","Data":"d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97"}
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.864439 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9pst" event={"ID":"b9dc8ed6-ae2b-4159-9019-b828f9440a09","Type":"ContainerDied","Data":"3a6e7ea31ec7c1091296ced06da3e02404200759948759f985b2e2b28b7d4ce6"}
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.864514 4626 scope.go:117] "RemoveContainer" containerID="d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97"
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.864880 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9pst"
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.955984 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9pst"]
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.965093 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j9pst"]
Feb 23 09:00:13 crc kubenswrapper[4626]: I0223 09:00:13.977755 4626 scope.go:117] "RemoveContainer" containerID="51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.018188 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" path="/var/lib/kubelet/pods/b9dc8ed6-ae2b-4159-9019-b828f9440a09/volumes"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.018362 4626 scope.go:117] "RemoveContainer" containerID="4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.062985 4626 scope.go:117] "RemoveContainer" containerID="d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97"
Feb 23 09:00:14 crc kubenswrapper[4626]: E0223 09:00:14.063826 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97\": container with ID starting with d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97 not found: ID does not exist" containerID="d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.063880 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97"} err="failed to get container status \"d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97\": rpc error: code = NotFound desc = could not find container \"d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97\": container with ID starting with d9cd420b6f536cdecfa7078cb3bfc031edf9cc8a89462a4dd498b39b5f6c5d97 not found: ID does not exist"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.063915 4626 scope.go:117] "RemoveContainer" containerID="51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13"
Feb 23 09:00:14 crc kubenswrapper[4626]: E0223 09:00:14.064512 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13\": container with ID starting with 51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13 not found: ID does not exist" containerID="51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.064569 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13"} err="failed to get container status \"51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13\": rpc error: code = NotFound desc = could not find container \"51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13\": container with ID starting with 51c1ead7c51f56028a435a80729bdcea5d370711020c8dc46d1833aae773cf13 not found: ID does not exist"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.064601 4626 scope.go:117] "RemoveContainer" containerID="4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427"
Feb 23 09:00:14 crc kubenswrapper[4626]: E0223 09:00:14.064941 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427\": container with ID starting with 4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427 not found: ID does not exist" containerID="4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427"
Feb 23 09:00:14 crc kubenswrapper[4626]: I0223 09:00:14.064965 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427"} err="failed to get container status \"4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427\": rpc error: code = NotFound desc = could not find container \"4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427\": container with ID starting with 4ac33b2c78aa3b1380c953241095a59f1665d629e87f0f05717d606f09b43427 not found: ID does not exist"
Feb 23 09:00:24 crc kubenswrapper[4626]: I0223 09:00:24.651436 4626 scope.go:117] "RemoveContainer" containerID="6554d67d66e6cf9055b893ce40e0039ffcd768076076c6a6f50e31868a739073"
Feb 23 09:00:25 crc kubenswrapper[4626]: I0223 09:00:25.685965 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:00:25 crc kubenswrapper[4626]: I0223 09:00:25.686417 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:00:55 crc kubenswrapper[4626]: I0223 09:00:55.685752 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:00:55 crc kubenswrapper[4626]: I0223 09:00:55.686326 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:00:55 crc kubenswrapper[4626]: I0223 09:00:55.686384 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 09:00:55 crc kubenswrapper[4626]: I0223 09:00:55.687948 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef0ef59bd5d386dff1c12a966d729743ec224ce7c0108b3374bbed5f762a9739"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 09:00:55 crc kubenswrapper[4626]: I0223 09:00:55.688034 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://ef0ef59bd5d386dff1c12a966d729743ec224ce7c0108b3374bbed5f762a9739" gracePeriod=600
Feb 23 09:00:56 crc kubenswrapper[4626]: I0223 09:00:56.292985 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="ef0ef59bd5d386dff1c12a966d729743ec224ce7c0108b3374bbed5f762a9739" exitCode=0
Feb 23 09:00:56 crc kubenswrapper[4626]: I0223 09:00:56.293296 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"ef0ef59bd5d386dff1c12a966d729743ec224ce7c0108b3374bbed5f762a9739"}
Feb 23 09:00:56 crc kubenswrapper[4626]: I0223 09:00:56.293352 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c"}
Feb 23 09:00:56 crc kubenswrapper[4626]: I0223 09:00:56.293375 4626 scope.go:117] "RemoveContainer" containerID="2906a5d1e081d9cc34a7cc783280aff6e3532da42f57ab7ea0f3884e093c34a3"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.171934 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29530621-vb4gm"]
Feb 23 09:01:00 crc kubenswrapper[4626]: E0223 09:01:00.173832 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="extract-utilities"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.173909 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="extract-utilities"
Feb 23 09:01:00 crc kubenswrapper[4626]: E0223 09:01:00.173983 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="registry-server"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.174034 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="registry-server"
Feb 23 09:01:00 crc kubenswrapper[4626]: E0223 09:01:00.174129 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="extract-content"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.174182 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="extract-content"
Feb 23 09:01:00 crc kubenswrapper[4626]: E0223 09:01:00.174238 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6552c5-b9c7-4908-a010-1f965b4f278b" containerName="collect-profiles"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.174293 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6552c5-b9c7-4908-a010-1f965b4f278b" containerName="collect-profiles"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.175058 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6552c5-b9c7-4908-a010-1f965b4f278b" containerName="collect-profiles"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.175160 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9dc8ed6-ae2b-4159-9019-b828f9440a09" containerName="registry-server"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.177738 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.197345 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530621-vb4gm"]
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.244676 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxwd7\" (UniqueName: \"kubernetes.io/projected/c637caea-745d-4de8-9111-addf155f30c3-kube-api-access-wxwd7\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.244879 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-config-data\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.245080 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-combined-ca-bundle\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.245196 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-fernet-keys\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.346956 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-fernet-keys\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.347204 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxwd7\" (UniqueName: \"kubernetes.io/projected/c637caea-745d-4de8-9111-addf155f30c3-kube-api-access-wxwd7\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.347274 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-config-data\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.347355 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-combined-ca-bundle\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.360262 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-combined-ca-bundle\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.360642 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-config-data\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.360643 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-fernet-keys\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.376757 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxwd7\" (UniqueName: \"kubernetes.io/projected/c637caea-745d-4de8-9111-addf155f30c3-kube-api-access-wxwd7\") pod \"keystone-cron-29530621-vb4gm\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " pod="openstack/keystone-cron-29530621-vb4gm"
Feb 23 09:01:00 crc kubenswrapper[4626]: I0223 09:01:00.494546 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530621-vb4gm" Feb 23 09:01:01 crc kubenswrapper[4626]: I0223 09:01:01.037033 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530621-vb4gm"] Feb 23 09:01:01 crc kubenswrapper[4626]: I0223 09:01:01.365991 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-vb4gm" event={"ID":"c637caea-745d-4de8-9111-addf155f30c3","Type":"ContainerStarted","Data":"5f78141fb9157606512c34be2516df73415f5dba7636169d29600c6b22cd3c12"} Feb 23 09:01:01 crc kubenswrapper[4626]: I0223 09:01:01.366282 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-vb4gm" event={"ID":"c637caea-745d-4de8-9111-addf155f30c3","Type":"ContainerStarted","Data":"96ac6910ce35f32ae1c0f9c36e1abfdd0a3c7e2c6aee0bd68afc1adac7f6748b"} Feb 23 09:01:01 crc kubenswrapper[4626]: I0223 09:01:01.390545 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29530621-vb4gm" podStartSLOduration=1.390526814 podStartE2EDuration="1.390526814s" podCreationTimestamp="2026-02-23 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:01:01.386824163 +0000 UTC m=+8413.726153430" watchObservedRunningTime="2026-02-23 09:01:01.390526814 +0000 UTC m=+8413.729856080" Feb 23 09:01:04 crc kubenswrapper[4626]: I0223 09:01:04.398613 4626 generic.go:334] "Generic (PLEG): container finished" podID="c637caea-745d-4de8-9111-addf155f30c3" containerID="5f78141fb9157606512c34be2516df73415f5dba7636169d29600c6b22cd3c12" exitCode=0 Feb 23 09:01:04 crc kubenswrapper[4626]: I0223 09:01:04.398705 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-vb4gm" 
event={"ID":"c637caea-745d-4de8-9111-addf155f30c3","Type":"ContainerDied","Data":"5f78141fb9157606512c34be2516df73415f5dba7636169d29600c6b22cd3c12"} Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.748853 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530621-vb4gm" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.873844 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-fernet-keys\") pod \"c637caea-745d-4de8-9111-addf155f30c3\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.873981 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-combined-ca-bundle\") pod \"c637caea-745d-4de8-9111-addf155f30c3\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.874148 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxwd7\" (UniqueName: \"kubernetes.io/projected/c637caea-745d-4de8-9111-addf155f30c3-kube-api-access-wxwd7\") pod \"c637caea-745d-4de8-9111-addf155f30c3\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.874180 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-config-data\") pod \"c637caea-745d-4de8-9111-addf155f30c3\" (UID: \"c637caea-745d-4de8-9111-addf155f30c3\") " Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.884775 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "c637caea-745d-4de8-9111-addf155f30c3" (UID: "c637caea-745d-4de8-9111-addf155f30c3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.885029 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c637caea-745d-4de8-9111-addf155f30c3-kube-api-access-wxwd7" (OuterVolumeSpecName: "kube-api-access-wxwd7") pod "c637caea-745d-4de8-9111-addf155f30c3" (UID: "c637caea-745d-4de8-9111-addf155f30c3"). InnerVolumeSpecName "kube-api-access-wxwd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.909465 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c637caea-745d-4de8-9111-addf155f30c3" (UID: "c637caea-745d-4de8-9111-addf155f30c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.928247 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-config-data" (OuterVolumeSpecName: "config-data") pod "c637caea-745d-4de8-9111-addf155f30c3" (UID: "c637caea-745d-4de8-9111-addf155f30c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.977875 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxwd7\" (UniqueName: \"kubernetes.io/projected/c637caea-745d-4de8-9111-addf155f30c3-kube-api-access-wxwd7\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.977916 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.977934 4626 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:05 crc kubenswrapper[4626]: I0223 09:01:05.977949 4626 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c637caea-745d-4de8-9111-addf155f30c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:06 crc kubenswrapper[4626]: I0223 09:01:06.422081 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-vb4gm" event={"ID":"c637caea-745d-4de8-9111-addf155f30c3","Type":"ContainerDied","Data":"96ac6910ce35f32ae1c0f9c36e1abfdd0a3c7e2c6aee0bd68afc1adac7f6748b"} Feb 23 09:01:06 crc kubenswrapper[4626]: I0223 09:01:06.422710 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ac6910ce35f32ae1c0f9c36e1abfdd0a3c7e2c6aee0bd68afc1adac7f6748b" Feb 23 09:01:06 crc kubenswrapper[4626]: I0223 09:01:06.422174 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530621-vb4gm" Feb 23 09:03:25 crc kubenswrapper[4626]: I0223 09:03:25.685111 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:03:25 crc kubenswrapper[4626]: I0223 09:03:25.685744 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:03:55 crc kubenswrapper[4626]: I0223 09:03:55.685699 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:03:55 crc kubenswrapper[4626]: I0223 09:03:55.686427 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.045240 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2l4"] Feb 23 09:04:10 crc kubenswrapper[4626]: E0223 09:04:10.046339 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c637caea-745d-4de8-9111-addf155f30c3" containerName="keystone-cron" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.046356 
4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c637caea-745d-4de8-9111-addf155f30c3" containerName="keystone-cron" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.046612 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c637caea-745d-4de8-9111-addf155f30c3" containerName="keystone-cron" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.048308 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.065737 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2l4"] Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.122184 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4qt\" (UniqueName: \"kubernetes.io/projected/20afec2b-406f-4b3a-8e37-23634c21325c-kube-api-access-mq4qt\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.122669 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-utilities\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.122840 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-catalog-content\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.225468 
4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-catalog-content\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.225798 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq4qt\" (UniqueName: \"kubernetes.io/projected/20afec2b-406f-4b3a-8e37-23634c21325c-kube-api-access-mq4qt\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.225942 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-utilities\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.226330 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-catalog-content\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.226388 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-utilities\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.246038 4626 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mq4qt\" (UniqueName: \"kubernetes.io/projected/20afec2b-406f-4b3a-8e37-23634c21325c-kube-api-access-mq4qt\") pod \"redhat-marketplace-hw2l4\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.369616 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:10 crc kubenswrapper[4626]: I0223 09:04:10.873866 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2l4"] Feb 23 09:04:11 crc kubenswrapper[4626]: I0223 09:04:11.024195 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2l4" event={"ID":"20afec2b-406f-4b3a-8e37-23634c21325c","Type":"ContainerStarted","Data":"d42d35e90e3d525f353bd396dbb679e28309aee7dbd341a6e47905c194d9b02a"} Feb 23 09:04:12 crc kubenswrapper[4626]: I0223 09:04:12.032889 4626 generic.go:334] "Generic (PLEG): container finished" podID="20afec2b-406f-4b3a-8e37-23634c21325c" containerID="30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10" exitCode=0 Feb 23 09:04:12 crc kubenswrapper[4626]: I0223 09:04:12.032991 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2l4" event={"ID":"20afec2b-406f-4b3a-8e37-23634c21325c","Type":"ContainerDied","Data":"30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10"} Feb 23 09:04:13 crc kubenswrapper[4626]: I0223 09:04:13.043837 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2l4" event={"ID":"20afec2b-406f-4b3a-8e37-23634c21325c","Type":"ContainerStarted","Data":"bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948"} Feb 23 09:04:14 crc kubenswrapper[4626]: I0223 09:04:14.052294 4626 generic.go:334] "Generic (PLEG): container finished" 
podID="20afec2b-406f-4b3a-8e37-23634c21325c" containerID="bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948" exitCode=0 Feb 23 09:04:14 crc kubenswrapper[4626]: I0223 09:04:14.052388 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2l4" event={"ID":"20afec2b-406f-4b3a-8e37-23634c21325c","Type":"ContainerDied","Data":"bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948"} Feb 23 09:04:15 crc kubenswrapper[4626]: I0223 09:04:15.060771 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2l4" event={"ID":"20afec2b-406f-4b3a-8e37-23634c21325c","Type":"ContainerStarted","Data":"dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd"} Feb 23 09:04:15 crc kubenswrapper[4626]: I0223 09:04:15.081485 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hw2l4" podStartSLOduration=2.592451222 podStartE2EDuration="5.081464784s" podCreationTimestamp="2026-02-23 09:04:10 +0000 UTC" firstStartedPulling="2026-02-23 09:04:12.035370561 +0000 UTC m=+8604.374699827" lastFinishedPulling="2026-02-23 09:04:14.524384122 +0000 UTC m=+8606.863713389" observedRunningTime="2026-02-23 09:04:15.073430457 +0000 UTC m=+8607.412759722" watchObservedRunningTime="2026-02-23 09:04:15.081464784 +0000 UTC m=+8607.420794051" Feb 23 09:04:20 crc kubenswrapper[4626]: I0223 09:04:20.370794 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:20 crc kubenswrapper[4626]: I0223 09:04:20.371623 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:20 crc kubenswrapper[4626]: I0223 09:04:20.415595 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:21 crc 
kubenswrapper[4626]: I0223 09:04:21.168643 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:21 crc kubenswrapper[4626]: I0223 09:04:21.250394 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2l4"] Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.132968 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hw2l4" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="registry-server" containerID="cri-o://dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd" gracePeriod=2 Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.647199 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.756856 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-utilities\") pod \"20afec2b-406f-4b3a-8e37-23634c21325c\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.757234 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-catalog-content\") pod \"20afec2b-406f-4b3a-8e37-23634c21325c\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.757283 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq4qt\" (UniqueName: \"kubernetes.io/projected/20afec2b-406f-4b3a-8e37-23634c21325c-kube-api-access-mq4qt\") pod \"20afec2b-406f-4b3a-8e37-23634c21325c\" (UID: \"20afec2b-406f-4b3a-8e37-23634c21325c\") " Feb 23 09:04:23 crc 
kubenswrapper[4626]: I0223 09:04:23.757657 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-utilities" (OuterVolumeSpecName: "utilities") pod "20afec2b-406f-4b3a-8e37-23634c21325c" (UID: "20afec2b-406f-4b3a-8e37-23634c21325c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.758081 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.764319 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20afec2b-406f-4b3a-8e37-23634c21325c-kube-api-access-mq4qt" (OuterVolumeSpecName: "kube-api-access-mq4qt") pod "20afec2b-406f-4b3a-8e37-23634c21325c" (UID: "20afec2b-406f-4b3a-8e37-23634c21325c"). InnerVolumeSpecName "kube-api-access-mq4qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.776132 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20afec2b-406f-4b3a-8e37-23634c21325c" (UID: "20afec2b-406f-4b3a-8e37-23634c21325c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.860109 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20afec2b-406f-4b3a-8e37-23634c21325c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:04:23 crc kubenswrapper[4626]: I0223 09:04:23.860148 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq4qt\" (UniqueName: \"kubernetes.io/projected/20afec2b-406f-4b3a-8e37-23634c21325c-kube-api-access-mq4qt\") on node \"crc\" DevicePath \"\"" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.144213 4626 generic.go:334] "Generic (PLEG): container finished" podID="20afec2b-406f-4b3a-8e37-23634c21325c" containerID="dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd" exitCode=0 Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.144283 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2l4" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.144297 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2l4" event={"ID":"20afec2b-406f-4b3a-8e37-23634c21325c","Type":"ContainerDied","Data":"dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd"} Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.144347 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2l4" event={"ID":"20afec2b-406f-4b3a-8e37-23634c21325c","Type":"ContainerDied","Data":"d42d35e90e3d525f353bd396dbb679e28309aee7dbd341a6e47905c194d9b02a"} Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.144371 4626 scope.go:117] "RemoveContainer" containerID="dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.174978 4626 scope.go:117] "RemoveContainer" 
containerID="bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.197406 4626 scope.go:117] "RemoveContainer" containerID="30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.197975 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2l4"] Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.207813 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2l4"] Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.235049 4626 scope.go:117] "RemoveContainer" containerID="dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd" Feb 23 09:04:24 crc kubenswrapper[4626]: E0223 09:04:24.236289 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd\": container with ID starting with dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd not found: ID does not exist" containerID="dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.236416 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd"} err="failed to get container status \"dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd\": rpc error: code = NotFound desc = could not find container \"dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd\": container with ID starting with dfede37662fee3767ed7260663831ed9570cce88e09129ea84e5ae6d4f78c7dd not found: ID does not exist" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.236556 4626 scope.go:117] "RemoveContainer" 
containerID="bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948" Feb 23 09:04:24 crc kubenswrapper[4626]: E0223 09:04:24.237046 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948\": container with ID starting with bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948 not found: ID does not exist" containerID="bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.237086 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948"} err="failed to get container status \"bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948\": rpc error: code = NotFound desc = could not find container \"bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948\": container with ID starting with bb9269075d41e0fbf4f5e796a2bb582292a6d33588aea458909934445108a948 not found: ID does not exist" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.237114 4626 scope.go:117] "RemoveContainer" containerID="30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10" Feb 23 09:04:24 crc kubenswrapper[4626]: E0223 09:04:24.237836 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10\": container with ID starting with 30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10 not found: ID does not exist" containerID="30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10" Feb 23 09:04:24 crc kubenswrapper[4626]: I0223 09:04:24.237936 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10"} err="failed to get container status \"30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10\": rpc error: code = NotFound desc = could not find container \"30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10\": container with ID starting with 30f57ee9b787ebad7b08c685b77e7c382b7b4e9e1d1b056cc28c0206b935db10 not found: ID does not exist" Feb 23 09:04:25 crc kubenswrapper[4626]: I0223 09:04:25.684930 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:04:25 crc kubenswrapper[4626]: I0223 09:04:25.685010 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:04:25 crc kubenswrapper[4626]: I0223 09:04:25.685083 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 09:04:25 crc kubenswrapper[4626]: I0223 09:04:25.686581 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:04:25 crc kubenswrapper[4626]: I0223 09:04:25.686656 4626 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" gracePeriod=600 Feb 23 09:04:25 crc kubenswrapper[4626]: E0223 09:04:25.809486 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:04:26 crc kubenswrapper[4626]: I0223 09:04:26.001576 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" path="/var/lib/kubelet/pods/20afec2b-406f-4b3a-8e37-23634c21325c/volumes" Feb 23 09:04:26 crc kubenswrapper[4626]: I0223 09:04:26.166616 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" exitCode=0 Feb 23 09:04:26 crc kubenswrapper[4626]: I0223 09:04:26.166657 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c"} Feb 23 09:04:26 crc kubenswrapper[4626]: I0223 09:04:26.166691 4626 scope.go:117] "RemoveContainer" containerID="ef0ef59bd5d386dff1c12a966d729743ec224ce7c0108b3374bbed5f762a9739" Feb 23 09:04:26 crc kubenswrapper[4626]: I0223 09:04:26.167570 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:04:26 crc kubenswrapper[4626]: E0223 
09:04:26.168000 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:04:39 crc kubenswrapper[4626]: I0223 09:04:39.981883 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:04:39 crc kubenswrapper[4626]: E0223 09:04:39.982795 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:04:50 crc kubenswrapper[4626]: I0223 09:04:50.981847 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:04:50 crc kubenswrapper[4626]: E0223 09:04:50.982814 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:05:02 crc kubenswrapper[4626]: I0223 09:05:02.981993 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:05:02 crc 
kubenswrapper[4626]: E0223 09:05:02.982818 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:05:16 crc kubenswrapper[4626]: I0223 09:05:16.982375 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:05:16 crc kubenswrapper[4626]: E0223 09:05:16.983192 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:05:27 crc kubenswrapper[4626]: I0223 09:05:27.987967 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:05:27 crc kubenswrapper[4626]: E0223 09:05:27.988880 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:05:42 crc kubenswrapper[4626]: I0223 09:05:42.981882 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 
23 09:05:42 crc kubenswrapper[4626]: E0223 09:05:42.983749 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:05:55 crc kubenswrapper[4626]: I0223 09:05:55.981890 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:05:55 crc kubenswrapper[4626]: E0223 09:05:55.982648 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:06:08 crc kubenswrapper[4626]: I0223 09:06:08.982462 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:06:08 crc kubenswrapper[4626]: E0223 09:06:08.983393 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:06:23 crc kubenswrapper[4626]: I0223 09:06:23.982792 4626 scope.go:117] "RemoveContainer" 
containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:06:24 crc kubenswrapper[4626]: E0223 09:06:23.999001 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:06:37 crc kubenswrapper[4626]: I0223 09:06:37.982515 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:06:37 crc kubenswrapper[4626]: E0223 09:06:37.983411 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:06:51 crc kubenswrapper[4626]: I0223 09:06:51.982475 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:06:51 crc kubenswrapper[4626]: E0223 09:06:51.983094 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:07:06 crc kubenswrapper[4626]: I0223 09:07:06.982792 4626 scope.go:117] 
"RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:07:06 crc kubenswrapper[4626]: E0223 09:07:06.983761 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:07:17 crc kubenswrapper[4626]: I0223 09:07:17.986940 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:07:17 crc kubenswrapper[4626]: E0223 09:07:17.989178 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:07:29 crc kubenswrapper[4626]: I0223 09:07:29.982957 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:07:29 crc kubenswrapper[4626]: E0223 09:07:29.983894 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:07:44 crc kubenswrapper[4626]: I0223 09:07:44.982714 
4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:07:44 crc kubenswrapper[4626]: E0223 09:07:44.984353 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:07:58 crc kubenswrapper[4626]: I0223 09:07:58.982005 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:07:58 crc kubenswrapper[4626]: E0223 09:07:58.983022 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:08:13 crc kubenswrapper[4626]: I0223 09:08:13.983202 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:08:13 crc kubenswrapper[4626]: E0223 09:08:13.984309 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:08:28 crc kubenswrapper[4626]: I0223 
09:08:28.983091 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:08:28 crc kubenswrapper[4626]: E0223 09:08:28.983959 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:08:41 crc kubenswrapper[4626]: I0223 09:08:41.983339 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:08:41 crc kubenswrapper[4626]: E0223 09:08:41.984341 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:08:56 crc kubenswrapper[4626]: I0223 09:08:56.982685 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:08:56 crc kubenswrapper[4626]: E0223 09:08:56.983600 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:09:07 crc 
kubenswrapper[4626]: I0223 09:09:07.985125 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:09:07 crc kubenswrapper[4626]: E0223 09:09:07.985854 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:09:19 crc kubenswrapper[4626]: I0223 09:09:19.982664 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:09:19 crc kubenswrapper[4626]: E0223 09:09:19.983746 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:09:34 crc kubenswrapper[4626]: I0223 09:09:34.982934 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:09:35 crc kubenswrapper[4626]: I0223 09:09:35.944375 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"73991df7fd7bbd7ff23b24d571c5ee94c50d4979173c3baf2e3d489323decae3"} Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.453176 4626 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-xc9k9"] Feb 23 09:11:35 crc kubenswrapper[4626]: E0223 09:11:35.454033 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="extract-utilities" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.454047 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="extract-utilities" Feb 23 09:11:35 crc kubenswrapper[4626]: E0223 09:11:35.454088 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="extract-content" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.454094 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="extract-content" Feb 23 09:11:35 crc kubenswrapper[4626]: E0223 09:11:35.454108 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="registry-server" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.454115 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="registry-server" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.454285 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="20afec2b-406f-4b3a-8e37-23634c21325c" containerName="registry-server" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.456098 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.467969 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xc9k9"] Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.523701 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-catalog-content\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.523980 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwsl\" (UniqueName: \"kubernetes.io/projected/998cefa8-31e7-42d5-9bfc-038133100d30-kube-api-access-5pwsl\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.524048 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-utilities\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.625927 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-catalog-content\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.626055 4626 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-5pwsl\" (UniqueName: \"kubernetes.io/projected/998cefa8-31e7-42d5-9bfc-038133100d30-kube-api-access-5pwsl\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.626106 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-utilities\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.626404 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-catalog-content\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.626547 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-utilities\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.645926 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwsl\" (UniqueName: \"kubernetes.io/projected/998cefa8-31e7-42d5-9bfc-038133100d30-kube-api-access-5pwsl\") pod \"redhat-operators-xc9k9\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:35 crc kubenswrapper[4626]: I0223 09:11:35.780738 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:36 crc kubenswrapper[4626]: I0223 09:11:36.462266 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xc9k9"] Feb 23 09:11:37 crc kubenswrapper[4626]: I0223 09:11:37.131306 4626 generic.go:334] "Generic (PLEG): container finished" podID="998cefa8-31e7-42d5-9bfc-038133100d30" containerID="965aef96571d0aab7ea90050b9298c0cecc69f688a3bb0647666a398ecdcd818" exitCode=0 Feb 23 09:11:37 crc kubenswrapper[4626]: I0223 09:11:37.131354 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc9k9" event={"ID":"998cefa8-31e7-42d5-9bfc-038133100d30","Type":"ContainerDied","Data":"965aef96571d0aab7ea90050b9298c0cecc69f688a3bb0647666a398ecdcd818"} Feb 23 09:11:37 crc kubenswrapper[4626]: I0223 09:11:37.131615 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc9k9" event={"ID":"998cefa8-31e7-42d5-9bfc-038133100d30","Type":"ContainerStarted","Data":"514c5a8479e21af613c15792d0eaf16957c4cdcf28c6fd3e2b2a221b4760c85f"} Feb 23 09:11:37 crc kubenswrapper[4626]: I0223 09:11:37.135128 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:11:39 crc kubenswrapper[4626]: I0223 09:11:39.157696 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc9k9" event={"ID":"998cefa8-31e7-42d5-9bfc-038133100d30","Type":"ContainerStarted","Data":"a334fe3ec2225f8740fd07179d81a97de77af2db55a6d8eca3cb221bfd599ea6"} Feb 23 09:11:41 crc kubenswrapper[4626]: I0223 09:11:41.177933 4626 generic.go:334] "Generic (PLEG): container finished" podID="998cefa8-31e7-42d5-9bfc-038133100d30" containerID="a334fe3ec2225f8740fd07179d81a97de77af2db55a6d8eca3cb221bfd599ea6" exitCode=0 Feb 23 09:11:41 crc kubenswrapper[4626]: I0223 09:11:41.177997 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xc9k9" event={"ID":"998cefa8-31e7-42d5-9bfc-038133100d30","Type":"ContainerDied","Data":"a334fe3ec2225f8740fd07179d81a97de77af2db55a6d8eca3cb221bfd599ea6"} Feb 23 09:11:42 crc kubenswrapper[4626]: I0223 09:11:42.190105 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc9k9" event={"ID":"998cefa8-31e7-42d5-9bfc-038133100d30","Type":"ContainerStarted","Data":"e80265e5fdd52fb70b4cef84f21301da8cedc3461e9a15b02989b4219c7ab078"} Feb 23 09:11:42 crc kubenswrapper[4626]: I0223 09:11:42.214706 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xc9k9" podStartSLOduration=2.64942627 podStartE2EDuration="7.214686973s" podCreationTimestamp="2026-02-23 09:11:35 +0000 UTC" firstStartedPulling="2026-02-23 09:11:37.134229033 +0000 UTC m=+9049.473558299" lastFinishedPulling="2026-02-23 09:11:41.699489746 +0000 UTC m=+9054.038819002" observedRunningTime="2026-02-23 09:11:42.213599443 +0000 UTC m=+9054.552928700" watchObservedRunningTime="2026-02-23 09:11:42.214686973 +0000 UTC m=+9054.554016239" Feb 23 09:11:45 crc kubenswrapper[4626]: I0223 09:11:45.781453 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:45 crc kubenswrapper[4626]: I0223 09:11:45.782282 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:11:46 crc kubenswrapper[4626]: I0223 09:11:46.828154 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xc9k9" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="registry-server" probeResult="failure" output=< Feb 23 09:11:46 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 09:11:46 crc kubenswrapper[4626]: > Feb 23 09:11:55 crc kubenswrapper[4626]: I0223 
09:11:55.685646 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:11:55 crc kubenswrapper[4626]: I0223 09:11:55.686606 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:11:56 crc kubenswrapper[4626]: I0223 09:11:56.821563 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xc9k9" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="registry-server" probeResult="failure" output=< Feb 23 09:11:56 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 09:11:56 crc kubenswrapper[4626]: > Feb 23 09:12:05 crc kubenswrapper[4626]: I0223 09:12:05.823426 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:12:05 crc kubenswrapper[4626]: I0223 09:12:05.862838 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:12:06 crc kubenswrapper[4626]: I0223 09:12:06.660185 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xc9k9"] Feb 23 09:12:07 crc kubenswrapper[4626]: I0223 09:12:07.441170 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xc9k9" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="registry-server" 
containerID="cri-o://e80265e5fdd52fb70b4cef84f21301da8cedc3461e9a15b02989b4219c7ab078" gracePeriod=2 Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.451360 4626 generic.go:334] "Generic (PLEG): container finished" podID="998cefa8-31e7-42d5-9bfc-038133100d30" containerID="e80265e5fdd52fb70b4cef84f21301da8cedc3461e9a15b02989b4219c7ab078" exitCode=0 Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.451402 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc9k9" event={"ID":"998cefa8-31e7-42d5-9bfc-038133100d30","Type":"ContainerDied","Data":"e80265e5fdd52fb70b4cef84f21301da8cedc3461e9a15b02989b4219c7ab078"} Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.576109 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.713996 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pwsl\" (UniqueName: \"kubernetes.io/projected/998cefa8-31e7-42d5-9bfc-038133100d30-kube-api-access-5pwsl\") pod \"998cefa8-31e7-42d5-9bfc-038133100d30\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.714404 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-catalog-content\") pod \"998cefa8-31e7-42d5-9bfc-038133100d30\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.714571 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-utilities\") pod \"998cefa8-31e7-42d5-9bfc-038133100d30\" (UID: \"998cefa8-31e7-42d5-9bfc-038133100d30\") " Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 
09:12:08.717397 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-utilities" (OuterVolumeSpecName: "utilities") pod "998cefa8-31e7-42d5-9bfc-038133100d30" (UID: "998cefa8-31e7-42d5-9bfc-038133100d30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.726895 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998cefa8-31e7-42d5-9bfc-038133100d30-kube-api-access-5pwsl" (OuterVolumeSpecName: "kube-api-access-5pwsl") pod "998cefa8-31e7-42d5-9bfc-038133100d30" (UID: "998cefa8-31e7-42d5-9bfc-038133100d30"). InnerVolumeSpecName "kube-api-access-5pwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.818948 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pwsl\" (UniqueName: \"kubernetes.io/projected/998cefa8-31e7-42d5-9bfc-038133100d30-kube-api-access-5pwsl\") on node \"crc\" DevicePath \"\"" Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.818992 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.855087 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "998cefa8-31e7-42d5-9bfc-038133100d30" (UID: "998cefa8-31e7-42d5-9bfc-038133100d30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:12:08 crc kubenswrapper[4626]: I0223 09:12:08.919922 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/998cefa8-31e7-42d5-9bfc-038133100d30-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.462900 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xc9k9" event={"ID":"998cefa8-31e7-42d5-9bfc-038133100d30","Type":"ContainerDied","Data":"514c5a8479e21af613c15792d0eaf16957c4cdcf28c6fd3e2b2a221b4760c85f"} Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.462975 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xc9k9" Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.463721 4626 scope.go:117] "RemoveContainer" containerID="e80265e5fdd52fb70b4cef84f21301da8cedc3461e9a15b02989b4219c7ab078" Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.495672 4626 scope.go:117] "RemoveContainer" containerID="a334fe3ec2225f8740fd07179d81a97de77af2db55a6d8eca3cb221bfd599ea6" Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.496734 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xc9k9"] Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.506569 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xc9k9"] Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.515879 4626 scope.go:117] "RemoveContainer" containerID="965aef96571d0aab7ea90050b9298c0cecc69f688a3bb0647666a398ecdcd818" Feb 23 09:12:09 crc kubenswrapper[4626]: I0223 09:12:09.995264 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" path="/var/lib/kubelet/pods/998cefa8-31e7-42d5-9bfc-038133100d30/volumes" Feb 23 09:12:25 crc 
kubenswrapper[4626]: I0223 09:12:25.685087 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:12:25 crc kubenswrapper[4626]: I0223 09:12:25.685524 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.685658 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.686257 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.686304 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.686809 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73991df7fd7bbd7ff23b24d571c5ee94c50d4979173c3baf2e3d489323decae3"} 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.686864 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://73991df7fd7bbd7ff23b24d571c5ee94c50d4979173c3baf2e3d489323decae3" gracePeriod=600 Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.870899 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="73991df7fd7bbd7ff23b24d571c5ee94c50d4979173c3baf2e3d489323decae3" exitCode=0 Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.870965 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"73991df7fd7bbd7ff23b24d571c5ee94c50d4979173c3baf2e3d489323decae3"} Feb 23 09:12:55 crc kubenswrapper[4626]: I0223 09:12:55.871013 4626 scope.go:117] "RemoveContainer" containerID="29ea9332c238c28fe7c24b38c0626c1c89df853aec350c03bc3fa0a899a0dc7c" Feb 23 09:12:56 crc kubenswrapper[4626]: I0223 09:12:56.883533 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185"} Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.448549 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rxm2x"] Feb 23 09:12:58 crc kubenswrapper[4626]: E0223 09:12:58.450871 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="registry-server" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.450913 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="registry-server" Feb 23 09:12:58 crc kubenswrapper[4626]: E0223 09:12:58.450973 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="extract-utilities" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.450982 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="extract-utilities" Feb 23 09:12:58 crc kubenswrapper[4626]: E0223 09:12:58.451020 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="extract-content" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.451027 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="extract-content" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.451243 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="998cefa8-31e7-42d5-9bfc-038133100d30" containerName="registry-server" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.454307 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.467110 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rxm2x"] Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.527235 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-utilities\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.527425 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8fj9\" (UniqueName: \"kubernetes.io/projected/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-kube-api-access-g8fj9\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.527690 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-catalog-content\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.630761 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-utilities\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.631121 4626 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g8fj9\" (UniqueName: \"kubernetes.io/projected/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-kube-api-access-g8fj9\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.631217 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-catalog-content\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.631319 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-utilities\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.631755 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-catalog-content\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.649534 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8fj9\" (UniqueName: \"kubernetes.io/projected/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-kube-api-access-g8fj9\") pod \"certified-operators-rxm2x\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") " pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:58 crc kubenswrapper[4626]: I0223 09:12:58.775596 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:12:59 crc kubenswrapper[4626]: I0223 09:12:59.250061 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rxm2x"] Feb 23 09:12:59 crc kubenswrapper[4626]: I0223 09:12:59.912063 4626 generic.go:334] "Generic (PLEG): container finished" podID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerID="96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e" exitCode=0 Feb 23 09:12:59 crc kubenswrapper[4626]: I0223 09:12:59.912167 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxm2x" event={"ID":"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec","Type":"ContainerDied","Data":"96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e"} Feb 23 09:12:59 crc kubenswrapper[4626]: I0223 09:12:59.912434 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxm2x" event={"ID":"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec","Type":"ContainerStarted","Data":"eb35d45f7acd7d74e62af163c268f381f15d33cded5daa3dab68b337d23d6857"} Feb 23 09:13:00 crc kubenswrapper[4626]: I0223 09:13:00.929375 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxm2x" event={"ID":"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec","Type":"ContainerStarted","Data":"fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020"} Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.051764 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k676f"] Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.054951 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.066518 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k676f"] Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.187210 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-catalog-content\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.187281 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxhz\" (UniqueName: \"kubernetes.io/projected/53ce0b03-ba8f-4352-a6ef-b7954a385174-kube-api-access-ppxhz\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.187488 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-utilities\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.289587 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxhz\" (UniqueName: \"kubernetes.io/projected/53ce0b03-ba8f-4352-a6ef-b7954a385174-kube-api-access-ppxhz\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.289815 4626 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-utilities\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.289854 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-catalog-content\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.290551 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-catalog-content\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.291653 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-utilities\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.312808 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxhz\" (UniqueName: \"kubernetes.io/projected/53ce0b03-ba8f-4352-a6ef-b7954a385174-kube-api-access-ppxhz\") pod \"community-operators-k676f\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.379243 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.886848 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k676f"] Feb 23 09:13:01 crc kubenswrapper[4626]: I0223 09:13:01.961253 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k676f" event={"ID":"53ce0b03-ba8f-4352-a6ef-b7954a385174","Type":"ContainerStarted","Data":"11cda271c92c0d57d683db44092820f8b2d49fe0febec530c6534f8d213ea3e2"} Feb 23 09:13:02 crc kubenswrapper[4626]: I0223 09:13:02.979854 4626 generic.go:334] "Generic (PLEG): container finished" podID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerID="5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b" exitCode=0 Feb 23 09:13:02 crc kubenswrapper[4626]: I0223 09:13:02.980368 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k676f" event={"ID":"53ce0b03-ba8f-4352-a6ef-b7954a385174","Type":"ContainerDied","Data":"5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b"} Feb 23 09:13:02 crc kubenswrapper[4626]: I0223 09:13:02.987055 4626 generic.go:334] "Generic (PLEG): container finished" podID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerID="fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020" exitCode=0 Feb 23 09:13:02 crc kubenswrapper[4626]: I0223 09:13:02.987101 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxm2x" event={"ID":"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec","Type":"ContainerDied","Data":"fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020"} Feb 23 09:13:03 crc kubenswrapper[4626]: I0223 09:13:03.999310 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k676f" 
event={"ID":"53ce0b03-ba8f-4352-a6ef-b7954a385174","Type":"ContainerStarted","Data":"db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705"} Feb 23 09:13:04 crc kubenswrapper[4626]: I0223 09:13:04.002069 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxm2x" event={"ID":"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec","Type":"ContainerStarted","Data":"0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539"} Feb 23 09:13:05 crc kubenswrapper[4626]: I0223 09:13:05.015242 4626 generic.go:334] "Generic (PLEG): container finished" podID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerID="db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705" exitCode=0 Feb 23 09:13:05 crc kubenswrapper[4626]: I0223 09:13:05.015293 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k676f" event={"ID":"53ce0b03-ba8f-4352-a6ef-b7954a385174","Type":"ContainerDied","Data":"db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705"} Feb 23 09:13:05 crc kubenswrapper[4626]: I0223 09:13:05.044184 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rxm2x" podStartSLOduration=3.510094054 podStartE2EDuration="7.044164201s" podCreationTimestamp="2026-02-23 09:12:58 +0000 UTC" firstStartedPulling="2026-02-23 09:12:59.914563093 +0000 UTC m=+9132.253892359" lastFinishedPulling="2026-02-23 09:13:03.44863324 +0000 UTC m=+9135.787962506" observedRunningTime="2026-02-23 09:13:04.038532485 +0000 UTC m=+9136.377861751" watchObservedRunningTime="2026-02-23 09:13:05.044164201 +0000 UTC m=+9137.383493466" Feb 23 09:13:06 crc kubenswrapper[4626]: I0223 09:13:06.028395 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k676f" 
event={"ID":"53ce0b03-ba8f-4352-a6ef-b7954a385174","Type":"ContainerStarted","Data":"a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7"} Feb 23 09:13:06 crc kubenswrapper[4626]: I0223 09:13:06.049549 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k676f" podStartSLOduration=2.533382472 podStartE2EDuration="5.049523211s" podCreationTimestamp="2026-02-23 09:13:01 +0000 UTC" firstStartedPulling="2026-02-23 09:13:02.983029691 +0000 UTC m=+9135.322358957" lastFinishedPulling="2026-02-23 09:13:05.499170431 +0000 UTC m=+9137.838499696" observedRunningTime="2026-02-23 09:13:06.045141881 +0000 UTC m=+9138.384471147" watchObservedRunningTime="2026-02-23 09:13:06.049523211 +0000 UTC m=+9138.388852477" Feb 23 09:13:08 crc kubenswrapper[4626]: I0223 09:13:08.775937 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:13:08 crc kubenswrapper[4626]: I0223 09:13:08.776681 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rxm2x" Feb 23 09:13:09 crc kubenswrapper[4626]: I0223 09:13:09.828203 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rxm2x" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="registry-server" probeResult="failure" output=< Feb 23 09:13:09 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 09:13:09 crc kubenswrapper[4626]: > Feb 23 09:13:11 crc kubenswrapper[4626]: I0223 09:13:11.380045 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:11 crc kubenswrapper[4626]: I0223 09:13:11.380429 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:11 crc 
kubenswrapper[4626]: I0223 09:13:11.418148 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:12 crc kubenswrapper[4626]: I0223 09:13:12.126398 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:12 crc kubenswrapper[4626]: I0223 09:13:12.183013 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k676f"] Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.106048 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k676f" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="registry-server" containerID="cri-o://a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7" gracePeriod=2 Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.555148 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.629424 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-catalog-content\") pod \"53ce0b03-ba8f-4352-a6ef-b7954a385174\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.629597 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-utilities\") pod \"53ce0b03-ba8f-4352-a6ef-b7954a385174\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.629664 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxhz\" (UniqueName: \"kubernetes.io/projected/53ce0b03-ba8f-4352-a6ef-b7954a385174-kube-api-access-ppxhz\") pod \"53ce0b03-ba8f-4352-a6ef-b7954a385174\" (UID: \"53ce0b03-ba8f-4352-a6ef-b7954a385174\") " Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.630234 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-utilities" (OuterVolumeSpecName: "utilities") pod "53ce0b03-ba8f-4352-a6ef-b7954a385174" (UID: "53ce0b03-ba8f-4352-a6ef-b7954a385174"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.630772 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.643724 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ce0b03-ba8f-4352-a6ef-b7954a385174-kube-api-access-ppxhz" (OuterVolumeSpecName: "kube-api-access-ppxhz") pod "53ce0b03-ba8f-4352-a6ef-b7954a385174" (UID: "53ce0b03-ba8f-4352-a6ef-b7954a385174"). InnerVolumeSpecName "kube-api-access-ppxhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.674253 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53ce0b03-ba8f-4352-a6ef-b7954a385174" (UID: "53ce0b03-ba8f-4352-a6ef-b7954a385174"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.732264 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53ce0b03-ba8f-4352-a6ef-b7954a385174-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:14 crc kubenswrapper[4626]: I0223 09:13:14.732305 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxhz\" (UniqueName: \"kubernetes.io/projected/53ce0b03-ba8f-4352-a6ef-b7954a385174-kube-api-access-ppxhz\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.120226 4626 generic.go:334] "Generic (PLEG): container finished" podID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerID="a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7" exitCode=0 Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.120409 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k676f" event={"ID":"53ce0b03-ba8f-4352-a6ef-b7954a385174","Type":"ContainerDied","Data":"a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7"} Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.120593 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k676f" event={"ID":"53ce0b03-ba8f-4352-a6ef-b7954a385174","Type":"ContainerDied","Data":"11cda271c92c0d57d683db44092820f8b2d49fe0febec530c6534f8d213ea3e2"} Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.120629 4626 scope.go:117] "RemoveContainer" containerID="a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7" Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.121376 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k676f" Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.151671 4626 scope.go:117] "RemoveContainer" containerID="db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705" Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.164584 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k676f"] Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.169037 4626 scope.go:117] "RemoveContainer" containerID="5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b" Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.170471 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k676f"] Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.204476 4626 scope.go:117] "RemoveContainer" containerID="a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7" Feb 23 09:13:15 crc kubenswrapper[4626]: E0223 09:13:15.206126 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7\": container with ID starting with a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7 not found: ID does not exist" containerID="a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7" Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.206194 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7"} err="failed to get container status \"a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7\": rpc error: code = NotFound desc = could not find container \"a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7\": container with ID starting with a66bf44d1f17352a26b8683fcbcb98f3d8f42232b947b25f9d426c9218aba3e7 not 
found: ID does not exist"
Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.206214 4626 scope.go:117] "RemoveContainer" containerID="db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705"
Feb 23 09:13:15 crc kubenswrapper[4626]: E0223 09:13:15.206547 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705\": container with ID starting with db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705 not found: ID does not exist" containerID="db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705"
Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.206567 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705"} err="failed to get container status \"db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705\": rpc error: code = NotFound desc = could not find container \"db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705\": container with ID starting with db917f9b092f681ac50a008877d002a2f7b86797e352905b8f275b15dd180705 not found: ID does not exist"
Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.206608 4626 scope.go:117] "RemoveContainer" containerID="5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b"
Feb 23 09:13:15 crc kubenswrapper[4626]: E0223 09:13:15.206909 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b\": container with ID starting with 5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b not found: ID does not exist" containerID="5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b"
Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.206932 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b"} err="failed to get container status \"5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b\": rpc error: code = NotFound desc = could not find container \"5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b\": container with ID starting with 5453d49f81f87230b3f3282ade26fe8eba2d598c5147ae650867b4f94092988b not found: ID does not exist"
Feb 23 09:13:15 crc kubenswrapper[4626]: I0223 09:13:15.992575 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" path="/var/lib/kubelet/pods/53ce0b03-ba8f-4352-a6ef-b7954a385174/volumes"
Feb 23 09:13:18 crc kubenswrapper[4626]: I0223 09:13:18.918366 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rxm2x"
Feb 23 09:13:18 crc kubenswrapper[4626]: I0223 09:13:18.964617 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rxm2x"
Feb 23 09:13:19 crc kubenswrapper[4626]: I0223 09:13:19.167883 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rxm2x"]
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.176713 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rxm2x" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="registry-server" containerID="cri-o://0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539" gracePeriod=2
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.641180 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rxm2x"
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.767343 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-catalog-content\") pod \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") "
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.767398 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8fj9\" (UniqueName: \"kubernetes.io/projected/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-kube-api-access-g8fj9\") pod \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") "
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.767574 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-utilities\") pod \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\" (UID: \"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec\") "
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.768148 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-utilities" (OuterVolumeSpecName: "utilities") pod "fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" (UID: "fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.774473 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-kube-api-access-g8fj9" (OuterVolumeSpecName: "kube-api-access-g8fj9") pod "fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" (UID: "fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec"). InnerVolumeSpecName "kube-api-access-g8fj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.809116 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" (UID: "fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.869924 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.869962 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8fj9\" (UniqueName: \"kubernetes.io/projected/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-kube-api-access-g8fj9\") on node \"crc\" DevicePath \"\""
Feb 23 09:13:20 crc kubenswrapper[4626]: I0223 09:13:20.869974 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.189410 4626 generic.go:334] "Generic (PLEG): container finished" podID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerID="0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539" exitCode=0
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.189551 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rxm2x"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.189486 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxm2x" event={"ID":"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec","Type":"ContainerDied","Data":"0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539"}
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.189823 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxm2x" event={"ID":"fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec","Type":"ContainerDied","Data":"eb35d45f7acd7d74e62af163c268f381f15d33cded5daa3dab68b337d23d6857"}
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.189849 4626 scope.go:117] "RemoveContainer" containerID="0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.225854 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rxm2x"]
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.234199 4626 scope.go:117] "RemoveContainer" containerID="fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.235511 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rxm2x"]
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.265290 4626 scope.go:117] "RemoveContainer" containerID="96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.298160 4626 scope.go:117] "RemoveContainer" containerID="0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539"
Feb 23 09:13:21 crc kubenswrapper[4626]: E0223 09:13:21.298530 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539\": container with ID starting with 0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539 not found: ID does not exist" containerID="0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.298567 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539"} err="failed to get container status \"0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539\": rpc error: code = NotFound desc = could not find container \"0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539\": container with ID starting with 0438492bcb5f41b8e0258840fc7cfda865196d5d612133f5e7288df823754539 not found: ID does not exist"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.298595 4626 scope.go:117] "RemoveContainer" containerID="fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020"
Feb 23 09:13:21 crc kubenswrapper[4626]: E0223 09:13:21.299001 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020\": container with ID starting with fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020 not found: ID does not exist" containerID="fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.299024 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020"} err="failed to get container status \"fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020\": rpc error: code = NotFound desc = could not find container \"fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020\": container with ID starting with fc26245b361922d3c6f1e9b5021ce358a9e726f592fa3664283b9e8140636020 not found: ID does not exist"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.299038 4626 scope.go:117] "RemoveContainer" containerID="96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e"
Feb 23 09:13:21 crc kubenswrapper[4626]: E0223 09:13:21.299242 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e\": container with ID starting with 96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e not found: ID does not exist" containerID="96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.299267 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e"} err="failed to get container status \"96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e\": rpc error: code = NotFound desc = could not find container \"96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e\": container with ID starting with 96c89571ee19b15f6f4d298c5817b5e8574354075a1960aad390bd269eb86b0e not found: ID does not exist"
Feb 23 09:13:21 crc kubenswrapper[4626]: I0223 09:13:21.992043 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" path="/var/lib/kubelet/pods/fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec/volumes"
Feb 23 09:14:55 crc kubenswrapper[4626]: I0223 09:14:55.685530 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:14:55 crc kubenswrapper[4626]: I0223 09:14:55.686114 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.292342 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"]
Feb 23 09:15:00 crc kubenswrapper[4626]: E0223 09:15:00.293435 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="registry-server"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293449 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="registry-server"
Feb 23 09:15:00 crc kubenswrapper[4626]: E0223 09:15:00.293460 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="extract-utilities"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293467 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="extract-utilities"
Feb 23 09:15:00 crc kubenswrapper[4626]: E0223 09:15:00.293486 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="registry-server"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293492 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="registry-server"
Feb 23 09:15:00 crc kubenswrapper[4626]: E0223 09:15:00.293516 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="extract-utilities"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293523 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="extract-utilities"
Feb 23 09:15:00 crc kubenswrapper[4626]: E0223 09:15:00.293548 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="extract-content"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293554 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="extract-content"
Feb 23 09:15:00 crc kubenswrapper[4626]: E0223 09:15:00.293573 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="extract-content"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293579 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="extract-content"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293769 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ce0b03-ba8f-4352-a6ef-b7954a385174" containerName="registry-server"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.293788 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5b7f44-5aa7-4ca6-b894-92c7bdcbaaec" containerName="registry-server"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.294608 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.300298 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"]
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.306022 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.314828 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.356224 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgwps\" (UniqueName: \"kubernetes.io/projected/efee1127-9770-496c-9bad-69d67a149b53-kube-api-access-zgwps\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.356302 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efee1127-9770-496c-9bad-69d67a149b53-config-volume\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.356378 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efee1127-9770-496c-9bad-69d67a149b53-secret-volume\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.458253 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efee1127-9770-496c-9bad-69d67a149b53-config-volume\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.458581 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efee1127-9770-496c-9bad-69d67a149b53-secret-volume\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.458839 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgwps\" (UniqueName: \"kubernetes.io/projected/efee1127-9770-496c-9bad-69d67a149b53-kube-api-access-zgwps\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.459198 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efee1127-9770-496c-9bad-69d67a149b53-config-volume\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.470418 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efee1127-9770-496c-9bad-69d67a149b53-secret-volume\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.475820 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgwps\" (UniqueName: \"kubernetes.io/projected/efee1127-9770-496c-9bad-69d67a149b53-kube-api-access-zgwps\") pod \"collect-profiles-29530635-kr4bg\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:00 crc kubenswrapper[4626]: I0223 09:15:00.626135 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:01 crc kubenswrapper[4626]: I0223 09:15:01.624840 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"]
Feb 23 09:15:02 crc kubenswrapper[4626]: I0223 09:15:02.064810 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg" event={"ID":"efee1127-9770-496c-9bad-69d67a149b53","Type":"ContainerStarted","Data":"002f5f16c4f8d160c453229c154379948186f3cabefe6646e743dac9e607d831"}
Feb 23 09:15:02 crc kubenswrapper[4626]: I0223 09:15:02.065307 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg" event={"ID":"efee1127-9770-496c-9bad-69d67a149b53","Type":"ContainerStarted","Data":"fc308bc8df182b0da886eb4dcec5863bf9e5887552f960f5601bbf7e238b1bc2"}
Feb 23 09:15:02 crc kubenswrapper[4626]: I0223 09:15:02.110694 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg" podStartSLOduration=2.110672794 podStartE2EDuration="2.110672794s" podCreationTimestamp="2026-02-23 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:15:02.109653603 +0000 UTC m=+9254.448982869" watchObservedRunningTime="2026-02-23 09:15:02.110672794 +0000 UTC m=+9254.450002060"
Feb 23 09:15:03 crc kubenswrapper[4626]: I0223 09:15:03.076261 4626 generic.go:334] "Generic (PLEG): container finished" podID="efee1127-9770-496c-9bad-69d67a149b53" containerID="002f5f16c4f8d160c453229c154379948186f3cabefe6646e743dac9e607d831" exitCode=0
Feb 23 09:15:03 crc kubenswrapper[4626]: I0223 09:15:03.076315 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg" event={"ID":"efee1127-9770-496c-9bad-69d67a149b53","Type":"ContainerDied","Data":"002f5f16c4f8d160c453229c154379948186f3cabefe6646e743dac9e607d831"}
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.170037 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xxtlh"]
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.172789 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.191414 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxtlh"]
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.268050 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-utilities\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.268097 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-catalog-content\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.268190 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrlhd\" (UniqueName: \"kubernetes.io/projected/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-kube-api-access-hrlhd\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.369726 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrlhd\" (UniqueName: \"kubernetes.io/projected/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-kube-api-access-hrlhd\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.370098 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-utilities\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.370139 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-catalog-content\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.370657 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-utilities\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.370761 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-catalog-content\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.395155 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrlhd\" (UniqueName: \"kubernetes.io/projected/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-kube-api-access-hrlhd\") pod \"redhat-marketplace-xxtlh\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") " pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.494374 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.608280 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.682710 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgwps\" (UniqueName: \"kubernetes.io/projected/efee1127-9770-496c-9bad-69d67a149b53-kube-api-access-zgwps\") pod \"efee1127-9770-496c-9bad-69d67a149b53\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") "
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.683081 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efee1127-9770-496c-9bad-69d67a149b53-secret-volume\") pod \"efee1127-9770-496c-9bad-69d67a149b53\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") "
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.683279 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efee1127-9770-496c-9bad-69d67a149b53-config-volume\") pod \"efee1127-9770-496c-9bad-69d67a149b53\" (UID: \"efee1127-9770-496c-9bad-69d67a149b53\") "
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.684649 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efee1127-9770-496c-9bad-69d67a149b53-config-volume" (OuterVolumeSpecName: "config-volume") pod "efee1127-9770-496c-9bad-69d67a149b53" (UID: "efee1127-9770-496c-9bad-69d67a149b53"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.689981 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efee1127-9770-496c-9bad-69d67a149b53-kube-api-access-zgwps" (OuterVolumeSpecName: "kube-api-access-zgwps") pod "efee1127-9770-496c-9bad-69d67a149b53" (UID: "efee1127-9770-496c-9bad-69d67a149b53"). InnerVolumeSpecName "kube-api-access-zgwps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.713297 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efee1127-9770-496c-9bad-69d67a149b53-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "efee1127-9770-496c-9bad-69d67a149b53" (UID: "efee1127-9770-496c-9bad-69d67a149b53"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.720786 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv"]
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.729814 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-qk2xv"]
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.786527 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efee1127-9770-496c-9bad-69d67a149b53-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.786559 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efee1127-9770-496c-9bad-69d67a149b53-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 09:15:04 crc kubenswrapper[4626]: I0223 09:15:04.786572 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgwps\" (UniqueName: \"kubernetes.io/projected/efee1127-9770-496c-9bad-69d67a149b53-kube-api-access-zgwps\") on node \"crc\" DevicePath \"\""
Feb 23 09:15:05 crc kubenswrapper[4626]: I0223 09:15:05.038187 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxtlh"]
Feb 23 09:15:05 crc kubenswrapper[4626]: W0223 09:15:05.039068 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c683c3_0d2b_430d_b3f7_07ed0a8beb4c.slice/crio-7f35b9b96d45f1c575fc6edae1eb90c132fb17e259de651083d08217e36b1849 WatchSource:0}: Error finding container 7f35b9b96d45f1c575fc6edae1eb90c132fb17e259de651083d08217e36b1849: Status 404 returned error can't find the container with id 7f35b9b96d45f1c575fc6edae1eb90c132fb17e259de651083d08217e36b1849
Feb 23 09:15:05 crc kubenswrapper[4626]: I0223 09:15:05.107594 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg" event={"ID":"efee1127-9770-496c-9bad-69d67a149b53","Type":"ContainerDied","Data":"fc308bc8df182b0da886eb4dcec5863bf9e5887552f960f5601bbf7e238b1bc2"}
Feb 23 09:15:05 crc kubenswrapper[4626]: I0223 09:15:05.107724 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-kr4bg"
Feb 23 09:15:05 crc kubenswrapper[4626]: I0223 09:15:05.107646 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc308bc8df182b0da886eb4dcec5863bf9e5887552f960f5601bbf7e238b1bc2"
Feb 23 09:15:05 crc kubenswrapper[4626]: I0223 09:15:05.109565 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxtlh" event={"ID":"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c","Type":"ContainerStarted","Data":"7f35b9b96d45f1c575fc6edae1eb90c132fb17e259de651083d08217e36b1849"}
Feb 23 09:15:05 crc kubenswrapper[4626]: I0223 09:15:05.993386 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865947e0-9c17-4817-89a3-2257f5683244" path="/var/lib/kubelet/pods/865947e0-9c17-4817-89a3-2257f5683244/volumes"
Feb 23 09:15:06 crc kubenswrapper[4626]: I0223 09:15:06.122522 4626 generic.go:334] "Generic (PLEG): container finished" podID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerID="bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6" exitCode=0
Feb 23 09:15:06 crc kubenswrapper[4626]: I0223 09:15:06.122579 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxtlh" event={"ID":"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c","Type":"ContainerDied","Data":"bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6"}
Feb 23 09:15:07 crc kubenswrapper[4626]: I0223 09:15:07.136272 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxtlh" event={"ID":"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c","Type":"ContainerStarted","Data":"44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d"}
Feb 23 09:15:08 crc kubenswrapper[4626]: I0223 09:15:08.148126 4626 generic.go:334] "Generic (PLEG): container finished" podID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerID="44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d" exitCode=0
Feb 23 09:15:08 crc kubenswrapper[4626]: I0223 09:15:08.148252 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxtlh" event={"ID":"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c","Type":"ContainerDied","Data":"44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d"}
Feb 23 09:15:09 crc kubenswrapper[4626]: I0223 09:15:09.161985 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxtlh" event={"ID":"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c","Type":"ContainerStarted","Data":"042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d"}
Feb 23 09:15:09 crc kubenswrapper[4626]: I0223 09:15:09.188682 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xxtlh" podStartSLOduration=2.6544166479999998 podStartE2EDuration="5.18866279s" podCreationTimestamp="2026-02-23 09:15:04 +0000 UTC" firstStartedPulling="2026-02-23 09:15:06.126554518 +0000 UTC m=+9258.465883784" lastFinishedPulling="2026-02-23 09:15:08.660800659 +0000 UTC m=+9261.000129926" observedRunningTime="2026-02-23 09:15:09.178995405 +0000 UTC m=+9261.518324670" watchObservedRunningTime="2026-02-23 09:15:09.18866279 +0000 UTC m=+9261.527992055"
Feb 23 09:15:14 crc kubenswrapper[4626]: I0223 09:15:14.494599 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:14 crc kubenswrapper[4626]: I0223 09:15:14.495283 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:14 crc kubenswrapper[4626]: I0223 09:15:14.536452 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:15 crc kubenswrapper[4626]: I0223 09:15:15.523311 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:15 crc kubenswrapper[4626]: I0223 09:15:15.571906 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxtlh"]
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.232550 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xxtlh" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="registry-server" containerID="cri-o://042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d" gracePeriod=2
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.687475 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxtlh"
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.768309 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-catalog-content\") pod \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") "
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.768359 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-utilities\") pod \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") "
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.769153 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-utilities" (OuterVolumeSpecName: "utilities") pod "e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" (UID: "e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.791316 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" (UID: "e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.869263 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrlhd\" (UniqueName: \"kubernetes.io/projected/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-kube-api-access-hrlhd\") pod \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\" (UID: \"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c\") "
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.869909 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.869932 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.877628 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-kube-api-access-hrlhd" (OuterVolumeSpecName: "kube-api-access-hrlhd") pod "e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" (UID: "e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c"). InnerVolumeSpecName "kube-api-access-hrlhd".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:15:17 crc kubenswrapper[4626]: I0223 09:15:17.972309 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrlhd\" (UniqueName: \"kubernetes.io/projected/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c-kube-api-access-hrlhd\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.246811 4626 generic.go:334] "Generic (PLEG): container finished" podID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerID="042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d" exitCode=0 Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.246905 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxtlh" event={"ID":"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c","Type":"ContainerDied","Data":"042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d"} Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.246963 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxtlh" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.247005 4626 scope.go:117] "RemoveContainer" containerID="042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.246988 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxtlh" event={"ID":"e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c","Type":"ContainerDied","Data":"7f35b9b96d45f1c575fc6edae1eb90c132fb17e259de651083d08217e36b1849"} Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.281598 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxtlh"] Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.291346 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxtlh"] Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.298559 4626 scope.go:117] "RemoveContainer" containerID="44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.317318 4626 scope.go:117] "RemoveContainer" containerID="bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.350365 4626 scope.go:117] "RemoveContainer" containerID="042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d" Feb 23 09:15:18 crc kubenswrapper[4626]: E0223 09:15:18.351038 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d\": container with ID starting with 042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d not found: ID does not exist" containerID="042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.351076 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d"} err="failed to get container status \"042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d\": rpc error: code = NotFound desc = could not find container \"042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d\": container with ID starting with 042570a9f4cb6af7cbdd8fb2d55ac0daa037499f59fc647f456838f7815c637d not found: ID does not exist" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.351100 4626 scope.go:117] "RemoveContainer" containerID="44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d" Feb 23 09:15:18 crc kubenswrapper[4626]: E0223 09:15:18.351515 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d\": container with ID starting with 44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d not found: ID does not exist" containerID="44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.351553 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d"} err="failed to get container status \"44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d\": rpc error: code = NotFound desc = could not find container \"44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d\": container with ID starting with 44d3412840b5f88659cfd768777f767f6de2072b37ded3c7de54985cd020427d not found: ID does not exist" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.351580 4626 scope.go:117] "RemoveContainer" containerID="bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6" Feb 23 09:15:18 crc kubenswrapper[4626]: E0223 
09:15:18.351919 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6\": container with ID starting with bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6 not found: ID does not exist" containerID="bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6" Feb 23 09:15:18 crc kubenswrapper[4626]: I0223 09:15:18.351956 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6"} err="failed to get container status \"bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6\": rpc error: code = NotFound desc = could not find container \"bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6\": container with ID starting with bbabb4b1d3a15af6de17d91c5be530c6ed2b8eb68785cc194c5a5cbbad0bfac6 not found: ID does not exist" Feb 23 09:15:19 crc kubenswrapper[4626]: I0223 09:15:19.991648 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" path="/var/lib/kubelet/pods/e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c/volumes" Feb 23 09:15:25 crc kubenswrapper[4626]: I0223 09:15:25.066351 4626 scope.go:117] "RemoveContainer" containerID="7dd92e3ddba97b0b3f2a161ca44b7039dc7848d8d778311ef09013e9b2781a58" Feb 23 09:15:25 crc kubenswrapper[4626]: I0223 09:15:25.685187 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:15:25 crc kubenswrapper[4626]: I0223 09:15:25.685479 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" 
podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:15:55 crc kubenswrapper[4626]: I0223 09:15:55.685461 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:15:55 crc kubenswrapper[4626]: I0223 09:15:55.686157 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:15:55 crc kubenswrapper[4626]: I0223 09:15:55.686215 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 09:15:55 crc kubenswrapper[4626]: I0223 09:15:55.686912 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:15:55 crc kubenswrapper[4626]: I0223 09:15:55.686984 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" gracePeriod=600 Feb 23 
09:15:55 crc kubenswrapper[4626]: E0223 09:15:55.813122 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:15:56 crc kubenswrapper[4626]: I0223 09:15:56.621901 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" exitCode=0 Feb 23 09:15:56 crc kubenswrapper[4626]: I0223 09:15:56.621954 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185"} Feb 23 09:15:56 crc kubenswrapper[4626]: I0223 09:15:56.622008 4626 scope.go:117] "RemoveContainer" containerID="73991df7fd7bbd7ff23b24d571c5ee94c50d4979173c3baf2e3d489323decae3" Feb 23 09:15:56 crc kubenswrapper[4626]: I0223 09:15:56.622986 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:15:56 crc kubenswrapper[4626]: E0223 09:15:56.623536 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:16:11 crc kubenswrapper[4626]: I0223 09:16:11.983100 4626 
scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:16:11 crc kubenswrapper[4626]: E0223 09:16:11.985236 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:16:25 crc kubenswrapper[4626]: I0223 09:16:25.982029 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:16:25 crc kubenswrapper[4626]: E0223 09:16:25.982873 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:16:37 crc kubenswrapper[4626]: I0223 09:16:37.988179 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:16:37 crc kubenswrapper[4626]: E0223 09:16:37.988953 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:16:48 crc kubenswrapper[4626]: I0223 
09:16:48.981612 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:16:48 crc kubenswrapper[4626]: E0223 09:16:48.982292 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:17:02 crc kubenswrapper[4626]: I0223 09:17:02.983021 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:17:02 crc kubenswrapper[4626]: E0223 09:17:02.984008 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:17:16 crc kubenswrapper[4626]: I0223 09:17:16.982033 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:17:16 crc kubenswrapper[4626]: E0223 09:17:16.983004 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:17:31 crc 
kubenswrapper[4626]: I0223 09:17:31.982343 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:17:31 crc kubenswrapper[4626]: E0223 09:17:31.983364 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:17:46 crc kubenswrapper[4626]: I0223 09:17:46.982730 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:17:46 crc kubenswrapper[4626]: E0223 09:17:46.984678 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:17:59 crc kubenswrapper[4626]: I0223 09:17:59.991872 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:17:59 crc kubenswrapper[4626]: E0223 09:17:59.994604 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 
23 09:18:13 crc kubenswrapper[4626]: I0223 09:18:13.984040 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:18:13 crc kubenswrapper[4626]: E0223 09:18:13.985003 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:18:24 crc kubenswrapper[4626]: I0223 09:18:24.982208 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:18:24 crc kubenswrapper[4626]: E0223 09:18:24.983325 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:18:38 crc kubenswrapper[4626]: I0223 09:18:38.983247 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:18:38 crc kubenswrapper[4626]: E0223 09:18:38.984156 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" 
podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:18:53 crc kubenswrapper[4626]: I0223 09:18:53.982087 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:18:53 crc kubenswrapper[4626]: E0223 09:18:53.983129 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:19:04 crc kubenswrapper[4626]: I0223 09:19:04.983477 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:19:04 crc kubenswrapper[4626]: E0223 09:19:04.984570 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:19:18 crc kubenswrapper[4626]: I0223 09:19:18.983380 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:19:18 crc kubenswrapper[4626]: E0223 09:19:18.985690 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:19:29 crc kubenswrapper[4626]: I0223 09:19:29.983673 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:19:29 crc kubenswrapper[4626]: E0223 09:19:29.984488 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:19:40 crc kubenswrapper[4626]: I0223 09:19:40.983711 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:19:40 crc kubenswrapper[4626]: E0223 09:19:40.984686 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:19:53 crc kubenswrapper[4626]: I0223 09:19:53.983000 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:19:53 crc kubenswrapper[4626]: E0223 09:19:53.983952 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:20:08 crc kubenswrapper[4626]: I0223 09:20:08.012803 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:20:08 crc kubenswrapper[4626]: E0223 09:20:08.014240 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:20:20 crc kubenswrapper[4626]: I0223 09:20:20.983144 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:20:20 crc kubenswrapper[4626]: E0223 09:20:20.984595 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:20:32 crc kubenswrapper[4626]: I0223 09:20:32.982217 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185" Feb 23 09:20:32 crc kubenswrapper[4626]: E0223 09:20:32.983015 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:20:46 crc kubenswrapper[4626]: I0223 09:20:46.982961 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185"
Feb 23 09:20:46 crc kubenswrapper[4626]: E0223 09:20:46.984801 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:21:01 crc kubenswrapper[4626]: I0223 09:21:01.982651 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185"
Feb 23 09:21:02 crc kubenswrapper[4626]: I0223 09:21:02.494149 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"0e2319dd85ff58f417e9b19da900255d78470cf0b6837778a9207fb2e2656ffd"}
Feb 23 09:22:47 crc kubenswrapper[4626]: I0223 09:22:47.556864 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"4774892a-4776-4db1-b74e-d78df11aa97e","Type":"ContainerDied","Data":"7ad08357352f67f88732dd4c1e7f11901676a755054b59f42eb83b2c913b3c20"}
Feb 23 09:22:47 crc kubenswrapper[4626]: I0223 09:22:47.557449 4626 generic.go:334] "Generic (PLEG): container finished" podID="4774892a-4776-4db1-b74e-d78df11aa97e" containerID="7ad08357352f67f88732dd4c1e7f11901676a755054b59f42eb83b2c913b3c20" exitCode=0
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.317137 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354443 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ca-certs\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354508 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-temporary\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354548 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354581 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-workdir\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354666 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config-secret\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354725 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bdkj\" (UniqueName: \"kubernetes.io/projected/4774892a-4776-4db1-b74e-d78df11aa97e-kube-api-access-5bdkj\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354744 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354764 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ssh-key\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.354790 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-config-data\") pod \"4774892a-4776-4db1-b74e-d78df11aa97e\" (UID: \"4774892a-4776-4db1-b74e-d78df11aa97e\") "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.358694 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-config-data" (OuterVolumeSpecName: "config-data") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.362730 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.363726 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.392204 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4774892a-4776-4db1-b74e-d78df11aa97e-kube-api-access-5bdkj" (OuterVolumeSpecName: "kube-api-access-5bdkj") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "kube-api-access-5bdkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.400629 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.422587 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.424923 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.435964 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.449352 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4774892a-4776-4db1-b74e-d78df11aa97e" (UID: "4774892a-4776-4db1-b74e-d78df11aa97e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.456706 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.456736 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bdkj\" (UniqueName: \"kubernetes.io/projected/4774892a-4776-4db1-b74e-d78df11aa97e-kube-api-access-5bdkj\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.456748 4626 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.456757 4626 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ssh-key\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.456765 4626 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4774892a-4776-4db1-b74e-d78df11aa97e-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.456773 4626 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4774892a-4776-4db1-b74e-d78df11aa97e-ca-certs\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.456783 4626 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.458549 4626 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.458581 4626 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4774892a-4776-4db1-b74e-d78df11aa97e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.478157 4626 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.560875 4626 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.579378 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"4774892a-4776-4db1-b74e-d78df11aa97e","Type":"ContainerDied","Data":"fe7054b3686175475873f70259ed650445a8d1382a2a1031820607136a312904"}
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.579454 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 23 09:22:49 crc kubenswrapper[4626]: I0223 09:22:49.580206 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe7054b3686175475873f70259ed650445a8d1382a2a1031820607136a312904"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.502678 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 23 09:22:57 crc kubenswrapper[4626]: E0223 09:22:57.507551 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="registry-server"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.507579 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="registry-server"
Feb 23 09:22:57 crc kubenswrapper[4626]: E0223 09:22:57.507782 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="extract-utilities"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.507802 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="extract-utilities"
Feb 23 09:22:57 crc kubenswrapper[4626]: E0223 09:22:57.507822 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="extract-content"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.507834 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="extract-content"
Feb 23 09:22:57 crc kubenswrapper[4626]: E0223 09:22:57.507848 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4774892a-4776-4db1-b74e-d78df11aa97e" containerName="tempest-tests-tempest-tests-runner"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.507854 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4774892a-4776-4db1-b74e-d78df11aa97e" containerName="tempest-tests-tempest-tests-runner"
Feb 23 09:22:57 crc kubenswrapper[4626]: E0223 09:22:57.507873 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efee1127-9770-496c-9bad-69d67a149b53" containerName="collect-profiles"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.507880 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="efee1127-9770-496c-9bad-69d67a149b53" containerName="collect-profiles"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.508395 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="efee1127-9770-496c-9bad-69d67a149b53" containerName="collect-profiles"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.508423 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4774892a-4776-4db1-b74e-d78df11aa97e" containerName="tempest-tests-tempest-tests-runner"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.508437 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c683c3-0d2b-430d-b3f7-07ed0a8beb4c" containerName="registry-server"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.510348 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.525408 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-pdr96"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.540718 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.624581 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d0209d94-e31f-4614-aa4d-e48ed971fcff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.624811 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmtbl\" (UniqueName: \"kubernetes.io/projected/d0209d94-e31f-4614-aa4d-e48ed971fcff-kube-api-access-xmtbl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d0209d94-e31f-4614-aa4d-e48ed971fcff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.727147 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmtbl\" (UniqueName: \"kubernetes.io/projected/d0209d94-e31f-4614-aa4d-e48ed971fcff-kube-api-access-xmtbl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d0209d94-e31f-4614-aa4d-e48ed971fcff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.727384 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d0209d94-e31f-4614-aa4d-e48ed971fcff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.730091 4626 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d0209d94-e31f-4614-aa4d-e48ed971fcff\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.753031 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d0209d94-e31f-4614-aa4d-e48ed971fcff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.755377 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmtbl\" (UniqueName: \"kubernetes.io/projected/d0209d94-e31f-4614-aa4d-e48ed971fcff-kube-api-access-xmtbl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d0209d94-e31f-4614-aa4d-e48ed971fcff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:57 crc kubenswrapper[4626]: I0223 09:22:57.831181 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 09:22:58 crc kubenswrapper[4626]: I0223 09:22:58.278376 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 23 09:22:58 crc kubenswrapper[4626]: W0223 09:22:58.317164 4626 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0209d94_e31f_4614_aa4d_e48ed971fcff.slice/crio-e73c4225e93f2559d4879b95c7c03b08b7c5eb03f684d3270b5c9c4ca86b93f7 WatchSource:0}: Error finding container e73c4225e93f2559d4879b95c7c03b08b7c5eb03f684d3270b5c9c4ca86b93f7: Status 404 returned error can't find the container with id e73c4225e93f2559d4879b95c7c03b08b7c5eb03f684d3270b5c9c4ca86b93f7
Feb 23 09:22:58 crc kubenswrapper[4626]: I0223 09:22:58.332034 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 09:22:58 crc kubenswrapper[4626]: I0223 09:22:58.653812 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d0209d94-e31f-4614-aa4d-e48ed971fcff","Type":"ContainerStarted","Data":"e73c4225e93f2559d4879b95c7c03b08b7c5eb03f684d3270b5c9c4ca86b93f7"}
Feb 23 09:23:00 crc kubenswrapper[4626]: I0223 09:23:00.673721 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d0209d94-e31f-4614-aa4d-e48ed971fcff","Type":"ContainerStarted","Data":"2d33645f92c01360ce93c13475f6fba88def7dbb2804feb9e2af5ff3d6eb73be"}
Feb 23 09:23:00 crc kubenswrapper[4626]: I0223 09:23:00.692943 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.475544221 podStartE2EDuration="3.692918158s" podCreationTimestamp="2026-02-23 09:22:57 +0000 UTC" firstStartedPulling="2026-02-23 09:22:58.328522716 +0000 UTC m=+9730.667851982" lastFinishedPulling="2026-02-23 09:22:59.545896653 +0000 UTC m=+9731.885225919" observedRunningTime="2026-02-23 09:23:00.684271007 +0000 UTC m=+9733.023600273" watchObservedRunningTime="2026-02-23 09:23:00.692918158 +0000 UTC m=+9733.032247424"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.732805 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4kxkp"]
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.737927 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.752969 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kxkp"]
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.822229 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-catalog-content\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.822573 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-utilities\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.822741 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcqck\" (UniqueName: \"kubernetes.io/projected/5df4242c-b640-43a2-bedf-8a6aaae4d38c-kube-api-access-pcqck\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.924608 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-utilities\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.924834 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcqck\" (UniqueName: \"kubernetes.io/projected/5df4242c-b640-43a2-bedf-8a6aaae4d38c-kube-api-access-pcqck\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.924886 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-catalog-content\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.927033 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-utilities\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.927080 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-catalog-content\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:15 crc kubenswrapper[4626]: I0223 09:23:15.953440 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcqck\" (UniqueName: \"kubernetes.io/projected/5df4242c-b640-43a2-bedf-8a6aaae4d38c-kube-api-access-pcqck\") pod \"redhat-operators-4kxkp\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:16 crc kubenswrapper[4626]: I0223 09:23:16.068303 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kxkp"
Feb 23 09:23:16 crc kubenswrapper[4626]: I0223 09:23:16.445618 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4kxkp"]
Feb 23 09:23:16 crc kubenswrapper[4626]: I0223 09:23:16.843861 4626 generic.go:334] "Generic (PLEG): container finished" podID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerID="f8f06f795715aa7419b6bdf49e31d621c609113784fcc8c0d9ff63e405387d3b" exitCode=0
Feb 23 09:23:16 crc kubenswrapper[4626]: I0223 09:23:16.844247 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kxkp" event={"ID":"5df4242c-b640-43a2-bedf-8a6aaae4d38c","Type":"ContainerDied","Data":"f8f06f795715aa7419b6bdf49e31d621c609113784fcc8c0d9ff63e405387d3b"}
Feb 23 09:23:16 crc kubenswrapper[4626]: I0223 09:23:16.844282 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kxkp" event={"ID":"5df4242c-b640-43a2-bedf-8a6aaae4d38c","Type":"ContainerStarted","Data":"d4c1dd0ab8d2e5e55f0ca9e0ebd42453a0f630b76bef3165b20882368317db13"}
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.460528 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vhk9d/must-gather-5dwvn"]
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.462782 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.464784 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vhk9d"/"default-dockercfg-qmtp4"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.465858 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vhk9d"/"openshift-service-ca.crt"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.466853 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vhk9d"/"kube-root-ca.crt"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.472517 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vhk9d/must-gather-5dwvn"]
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.586761 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrkdf\" (UniqueName: \"kubernetes.io/projected/0529f017-1e22-4c12-a8c8-c89be8726d3e-kube-api-access-qrkdf\") pod \"must-gather-5dwvn\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") " pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.587392 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0529f017-1e22-4c12-a8c8-c89be8726d3e-must-gather-output\") pod \"must-gather-5dwvn\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") " pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.688940 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0529f017-1e22-4c12-a8c8-c89be8726d3e-must-gather-output\") pod \"must-gather-5dwvn\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") " pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.688994 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrkdf\" (UniqueName: \"kubernetes.io/projected/0529f017-1e22-4c12-a8c8-c89be8726d3e-kube-api-access-qrkdf\") pod \"must-gather-5dwvn\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") " pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.689630 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0529f017-1e22-4c12-a8c8-c89be8726d3e-must-gather-output\") pod \"must-gather-5dwvn\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") " pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.710560 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrkdf\" (UniqueName: \"kubernetes.io/projected/0529f017-1e22-4c12-a8c8-c89be8726d3e-kube-api-access-qrkdf\") pod \"must-gather-5dwvn\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") " pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.791865 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:23:18 crc kubenswrapper[4626]: I0223 09:23:18.866611 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kxkp" event={"ID":"5df4242c-b640-43a2-bedf-8a6aaae4d38c","Type":"ContainerStarted","Data":"75e6d7ed485d60402b6a8164e1317c2bdc87024a000a0d8a793e7f7ad52f28c7"}
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.271116 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vhk9d/must-gather-5dwvn"]
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.325917 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4jlld"]
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.328876 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.365626 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jlld"]
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.409418 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-catalog-content\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.409818 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68gf\" (UniqueName: \"kubernetes.io/projected/aef61323-8c8b-4f82-ad09-721f3027dea1-kube-api-access-w68gf\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.409894 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-utilities\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.512443 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-catalog-content\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.513181 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w68gf\" (UniqueName: \"kubernetes.io/projected/aef61323-8c8b-4f82-ad09-721f3027dea1-kube-api-access-w68gf\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.513266 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-utilities\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.512952 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-catalog-content\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.514268 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-utilities\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.535441 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68gf\" (UniqueName: \"kubernetes.io/projected/aef61323-8c8b-4f82-ad09-721f3027dea1-kube-api-access-w68gf\") pod \"community-operators-4jlld\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.664147 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jlld"
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.886722 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/must-gather-5dwvn" event={"ID":"0529f017-1e22-4c12-a8c8-c89be8726d3e","Type":"ContainerStarted","Data":"27f7015006db3bfe02ef586ba49a7e0548a6536408ff73f2ccd6f3869cda164f"}
Feb 23 09:23:19 crc kubenswrapper[4626]: I0223 09:23:19.968352 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jlld"]
Feb 23 09:23:20 crc kubenswrapper[4626]: I0223 09:23:20.895450 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerStarted","Data":"a085e5fd2c24ff914a82bdc0b519e138a9c79ce3e3a7b33326c72d0879543c46"}
Feb 23 09:23:20 crc kubenswrapper[4626]: I0223 09:23:20.895762 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerStarted","Data":"011a17cf4ba711aa7ec67c87aa19765b1c376598950e81cbbb2cccece1b70dd5"}
Feb 23 09:23:21 crc kubenswrapper[4626]: I0223 09:23:21.925664 4626 generic.go:334] "Generic (PLEG): container finished" podID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerID="a085e5fd2c24ff914a82bdc0b519e138a9c79ce3e3a7b33326c72d0879543c46" exitCode=0
Feb 23 09:23:21 crc kubenswrapper[4626]: I0223 09:23:21.925774 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerDied","Data":"a085e5fd2c24ff914a82bdc0b519e138a9c79ce3e3a7b33326c72d0879543c46"}
Feb 23 09:23:21 crc kubenswrapper[4626]: I0223 09:23:21.937624 4626 generic.go:334] "Generic (PLEG): container finished" podID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerID="75e6d7ed485d60402b6a8164e1317c2bdc87024a000a0d8a793e7f7ad52f28c7" exitCode=0
Feb 23 09:23:21 crc kubenswrapper[4626]: I0223 09:23:21.937676 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kxkp" event={"ID":"5df4242c-b640-43a2-bedf-8a6aaae4d38c","Type":"ContainerDied","Data":"75e6d7ed485d60402b6a8164e1317c2bdc87024a000a0d8a793e7f7ad52f28c7"}
Feb 23 09:23:25 crc kubenswrapper[4626]: I0223 09:23:25.686337 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:23:25 crc kubenswrapper[4626]: I0223 09:23:25.687115 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:23:29 crc kubenswrapper[4626]: I0223 09:23:29.025852 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kxkp" event={"ID":"5df4242c-b640-43a2-bedf-8a6aaae4d38c","Type":"ContainerStarted","Data":"3301dee6e19716bddef6caa193622a2ef2a02bf57cfce426ae949dc9b948fc08"}
Feb 23 09:23:29 crc kubenswrapper[4626]: I0223 09:23:29.028249 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/must-gather-5dwvn" event={"ID":"0529f017-1e22-4c12-a8c8-c89be8726d3e","Type":"ContainerStarted","Data":"093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58"}
Feb 23 09:23:29 crc kubenswrapper[4626]: I0223 09:23:29.028313 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/must-gather-5dwvn" event={"ID":"0529f017-1e22-4c12-a8c8-c89be8726d3e","Type":"ContainerStarted","Data":"226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698"}
Feb 23 09:23:29 crc kubenswrapper[4626]: I0223 09:23:29.032636 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerStarted","Data":"654ed146181647b24bdfdddae7ef8cb4e86bd6f9a9da918f99a668cc9c0e17bd"}
Feb 23 09:23:29 crc kubenswrapper[4626]: I0223 09:23:29.066556 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4kxkp" podStartSLOduration=2.7079392970000002 podStartE2EDuration="14.066532325s" podCreationTimestamp="2026-02-23 09:23:15 +0000 UTC" firstStartedPulling="2026-02-23 09:23:16.848344868 +0000 UTC m=+9749.187674133" lastFinishedPulling="2026-02-23 09:23:28.206937894 +0000 UTC m=+9760.546267161" observedRunningTime="2026-02-23 09:23:29.046307918 +0000 UTC m=+9761.385637184" watchObservedRunningTime="2026-02-23 09:23:29.066532325 +0000 UTC m=+9761.405861592"
Feb 23 09:23:29
crc kubenswrapper[4626]: I0223 09:23:29.108402 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vhk9d/must-gather-5dwvn" podStartSLOduration=2.12278843 podStartE2EDuration="11.108369769s" podCreationTimestamp="2026-02-23 09:23:18 +0000 UTC" firstStartedPulling="2026-02-23 09:23:19.275816655 +0000 UTC m=+9751.615145921" lastFinishedPulling="2026-02-23 09:23:28.261397994 +0000 UTC m=+9760.600727260" observedRunningTime="2026-02-23 09:23:29.083955232 +0000 UTC m=+9761.423284499" watchObservedRunningTime="2026-02-23 09:23:29.108369769 +0000 UTC m=+9761.447699034" Feb 23 09:23:30 crc kubenswrapper[4626]: I0223 09:23:30.042362 4626 generic.go:334] "Generic (PLEG): container finished" podID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerID="654ed146181647b24bdfdddae7ef8cb4e86bd6f9a9da918f99a668cc9c0e17bd" exitCode=0 Feb 23 09:23:30 crc kubenswrapper[4626]: I0223 09:23:30.042464 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerDied","Data":"654ed146181647b24bdfdddae7ef8cb4e86bd6f9a9da918f99a668cc9c0e17bd"} Feb 23 09:23:31 crc kubenswrapper[4626]: I0223 09:23:31.054331 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerStarted","Data":"673e32fec8d04e190adf70fb0d9fe982e5b5d8bbb2e59fac5d6f19f0637617f6"} Feb 23 09:23:31 crc kubenswrapper[4626]: I0223 09:23:31.092799 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4jlld" podStartSLOduration=3.457748202 podStartE2EDuration="12.092782921s" podCreationTimestamp="2026-02-23 09:23:19 +0000 UTC" firstStartedPulling="2026-02-23 09:23:21.931729105 +0000 UTC m=+9754.271058371" lastFinishedPulling="2026-02-23 09:23:30.566763824 +0000 UTC m=+9762.906093090" 
observedRunningTime="2026-02-23 09:23:31.086161637 +0000 UTC m=+9763.425490903" watchObservedRunningTime="2026-02-23 09:23:31.092782921 +0000 UTC m=+9763.432112186" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.570592 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-v2sbf"] Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.572250 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.728127 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/b512a4b3-6fbb-46db-a379-f42c1cef7bee-kube-api-access-6gpt5\") pod \"crc-debug-v2sbf\" (UID: \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") " pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.728697 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b512a4b3-6fbb-46db-a379-f42c1cef7bee-host\") pod \"crc-debug-v2sbf\" (UID: \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") " pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.830882 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/b512a4b3-6fbb-46db-a379-f42c1cef7bee-kube-api-access-6gpt5\") pod \"crc-debug-v2sbf\" (UID: \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") " pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.831235 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b512a4b3-6fbb-46db-a379-f42c1cef7bee-host\") pod \"crc-debug-v2sbf\" (UID: 
\"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") " pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.831904 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b512a4b3-6fbb-46db-a379-f42c1cef7bee-host\") pod \"crc-debug-v2sbf\" (UID: \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") " pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.853959 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/b512a4b3-6fbb-46db-a379-f42c1cef7bee-kube-api-access-6gpt5\") pod \"crc-debug-v2sbf\" (UID: \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") " pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:34 crc kubenswrapper[4626]: I0223 09:23:34.891480 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" Feb 23 09:23:35 crc kubenswrapper[4626]: I0223 09:23:35.104604 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" event={"ID":"b512a4b3-6fbb-46db-a379-f42c1cef7bee","Type":"ContainerStarted","Data":"42dd6260065eb73947bc2946d55e64cfa013aaabd32c241c72a04c792ff8edd6"} Feb 23 09:23:36 crc kubenswrapper[4626]: I0223 09:23:36.068776 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4kxkp" Feb 23 09:23:36 crc kubenswrapper[4626]: I0223 09:23:36.069537 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4kxkp" Feb 23 09:23:37 crc kubenswrapper[4626]: I0223 09:23:37.135637 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4kxkp" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="registry-server" probeResult="failure" output=< Feb 
23 09:23:37 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 09:23:37 crc kubenswrapper[4626]: > Feb 23 09:23:39 crc kubenswrapper[4626]: I0223 09:23:39.664688 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4jlld" Feb 23 09:23:39 crc kubenswrapper[4626]: I0223 09:23:39.665335 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4jlld" Feb 23 09:23:39 crc kubenswrapper[4626]: I0223 09:23:39.735710 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4jlld" Feb 23 09:23:40 crc kubenswrapper[4626]: I0223 09:23:40.227869 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4jlld" Feb 23 09:23:40 crc kubenswrapper[4626]: I0223 09:23:40.289653 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jlld"] Feb 23 09:23:42 crc kubenswrapper[4626]: I0223 09:23:42.192476 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4jlld" podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="registry-server" containerID="cri-o://673e32fec8d04e190adf70fb0d9fe982e5b5d8bbb2e59fac5d6f19f0637617f6" gracePeriod=2 Feb 23 09:23:43 crc kubenswrapper[4626]: I0223 09:23:43.212406 4626 generic.go:334] "Generic (PLEG): container finished" podID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerID="673e32fec8d04e190adf70fb0d9fe982e5b5d8bbb2e59fac5d6f19f0637617f6" exitCode=0 Feb 23 09:23:43 crc kubenswrapper[4626]: I0223 09:23:43.212687 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" 
event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerDied","Data":"673e32fec8d04e190adf70fb0d9fe982e5b5d8bbb2e59fac5d6f19f0637617f6"} Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.115946 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4kxkp" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="registry-server" probeResult="failure" output=< Feb 23 09:23:47 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 09:23:47 crc kubenswrapper[4626]: > Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.265161 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" event={"ID":"b512a4b3-6fbb-46db-a379-f42c1cef7bee","Type":"ContainerStarted","Data":"24f55ce3e1acda781f196e9163e82f87d89bef681187e82302878fa911662276"} Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.676870 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jlld" Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.710748 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" podStartSLOduration=1.6845781309999999 podStartE2EDuration="13.710730269s" podCreationTimestamp="2026-02-23 09:23:34 +0000 UTC" firstStartedPulling="2026-02-23 09:23:34.933669547 +0000 UTC m=+9767.272998813" lastFinishedPulling="2026-02-23 09:23:46.959821685 +0000 UTC m=+9779.299150951" observedRunningTime="2026-02-23 09:23:47.287806661 +0000 UTC m=+9779.627135927" watchObservedRunningTime="2026-02-23 09:23:47.710730269 +0000 UTC m=+9780.050059535" Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.804305 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w68gf\" (UniqueName: \"kubernetes.io/projected/aef61323-8c8b-4f82-ad09-721f3027dea1-kube-api-access-w68gf\") pod \"aef61323-8c8b-4f82-ad09-721f3027dea1\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.804516 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-utilities\") pod \"aef61323-8c8b-4f82-ad09-721f3027dea1\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.804611 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-catalog-content\") pod \"aef61323-8c8b-4f82-ad09-721f3027dea1\" (UID: \"aef61323-8c8b-4f82-ad09-721f3027dea1\") " Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.807004 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-utilities" 
(OuterVolumeSpecName: "utilities") pod "aef61323-8c8b-4f82-ad09-721f3027dea1" (UID: "aef61323-8c8b-4f82-ad09-721f3027dea1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.815008 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef61323-8c8b-4f82-ad09-721f3027dea1-kube-api-access-w68gf" (OuterVolumeSpecName: "kube-api-access-w68gf") pod "aef61323-8c8b-4f82-ad09-721f3027dea1" (UID: "aef61323-8c8b-4f82-ad09-721f3027dea1"). InnerVolumeSpecName "kube-api-access-w68gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.842527 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aef61323-8c8b-4f82-ad09-721f3027dea1" (UID: "aef61323-8c8b-4f82-ad09-721f3027dea1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.907031 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w68gf\" (UniqueName: \"kubernetes.io/projected/aef61323-8c8b-4f82-ad09-721f3027dea1-kube-api-access-w68gf\") on node \"crc\" DevicePath \"\"" Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.907059 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:23:47 crc kubenswrapper[4626]: I0223 09:23:47.907069 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef61323-8c8b-4f82-ad09-721f3027dea1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:23:48 crc kubenswrapper[4626]: I0223 09:23:48.279228 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jlld" event={"ID":"aef61323-8c8b-4f82-ad09-721f3027dea1","Type":"ContainerDied","Data":"011a17cf4ba711aa7ec67c87aa19765b1c376598950e81cbbb2cccece1b70dd5"} Feb 23 09:23:48 crc kubenswrapper[4626]: I0223 09:23:48.279284 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jlld" Feb 23 09:23:48 crc kubenswrapper[4626]: I0223 09:23:48.280116 4626 scope.go:117] "RemoveContainer" containerID="673e32fec8d04e190adf70fb0d9fe982e5b5d8bbb2e59fac5d6f19f0637617f6" Feb 23 09:23:48 crc kubenswrapper[4626]: I0223 09:23:48.314906 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jlld"] Feb 23 09:23:48 crc kubenswrapper[4626]: I0223 09:23:48.320898 4626 scope.go:117] "RemoveContainer" containerID="654ed146181647b24bdfdddae7ef8cb4e86bd6f9a9da918f99a668cc9c0e17bd" Feb 23 09:23:48 crc kubenswrapper[4626]: I0223 09:23:48.323433 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4jlld"] Feb 23 09:23:48 crc kubenswrapper[4626]: I0223 09:23:48.506648 4626 scope.go:117] "RemoveContainer" containerID="a085e5fd2c24ff914a82bdc0b519e138a9c79ce3e3a7b33326c72d0879543c46" Feb 23 09:23:49 crc kubenswrapper[4626]: I0223 09:23:49.992483 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" path="/var/lib/kubelet/pods/aef61323-8c8b-4f82-ad09-721f3027dea1/volumes" Feb 23 09:23:55 crc kubenswrapper[4626]: I0223 09:23:55.685257 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:23:55 crc kubenswrapper[4626]: I0223 09:23:55.685807 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:23:56 crc kubenswrapper[4626]: 
I0223 09:23:56.128540 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4kxkp" Feb 23 09:23:56 crc kubenswrapper[4626]: I0223 09:23:56.178697 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4kxkp" Feb 23 09:23:56 crc kubenswrapper[4626]: I0223 09:23:56.371927 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kxkp"] Feb 23 09:23:56 crc kubenswrapper[4626]: E0223 09:23:56.954874 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice/crio-011a17cf4ba711aa7ec67c87aa19765b1c376598950e81cbbb2cccece1b70dd5\": RecentStats: unable to find data in memory cache]" Feb 23 09:23:57 crc kubenswrapper[4626]: I0223 09:23:57.362776 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4kxkp" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="registry-server" containerID="cri-o://3301dee6e19716bddef6caa193622a2ef2a02bf57cfce426ae949dc9b948fc08" gracePeriod=2 Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.378002 4626 generic.go:334] "Generic (PLEG): container finished" podID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerID="3301dee6e19716bddef6caa193622a2ef2a02bf57cfce426ae949dc9b948fc08" exitCode=0 Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.378088 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kxkp" 
event={"ID":"5df4242c-b640-43a2-bedf-8a6aaae4d38c","Type":"ContainerDied","Data":"3301dee6e19716bddef6caa193622a2ef2a02bf57cfce426ae949dc9b948fc08"} Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.378573 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4kxkp" event={"ID":"5df4242c-b640-43a2-bedf-8a6aaae4d38c","Type":"ContainerDied","Data":"d4c1dd0ab8d2e5e55f0ca9e0ebd42453a0f630b76bef3165b20882368317db13"} Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.379036 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4c1dd0ab8d2e5e55f0ca9e0ebd42453a0f630b76bef3165b20882368317db13" Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.381472 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kxkp" Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.460711 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcqck\" (UniqueName: \"kubernetes.io/projected/5df4242c-b640-43a2-bedf-8a6aaae4d38c-kube-api-access-pcqck\") pod \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.460946 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-catalog-content\") pod \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.461058 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-utilities\") pod \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\" (UID: \"5df4242c-b640-43a2-bedf-8a6aaae4d38c\") " Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 
09:23:58.462285 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-utilities" (OuterVolumeSpecName: "utilities") pod "5df4242c-b640-43a2-bedf-8a6aaae4d38c" (UID: "5df4242c-b640-43a2-bedf-8a6aaae4d38c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.478789 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df4242c-b640-43a2-bedf-8a6aaae4d38c-kube-api-access-pcqck" (OuterVolumeSpecName: "kube-api-access-pcqck") pod "5df4242c-b640-43a2-bedf-8a6aaae4d38c" (UID: "5df4242c-b640-43a2-bedf-8a6aaae4d38c"). InnerVolumeSpecName "kube-api-access-pcqck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.565227 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcqck\" (UniqueName: \"kubernetes.io/projected/5df4242c-b640-43a2-bedf-8a6aaae4d38c-kube-api-access-pcqck\") on node \"crc\" DevicePath \"\"" Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.565344 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.586379 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5df4242c-b640-43a2-bedf-8a6aaae4d38c" (UID: "5df4242c-b640-43a2-bedf-8a6aaae4d38c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:23:58 crc kubenswrapper[4626]: I0223 09:23:58.667622 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5df4242c-b640-43a2-bedf-8a6aaae4d38c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:23:59 crc kubenswrapper[4626]: I0223 09:23:59.387340 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4kxkp" Feb 23 09:23:59 crc kubenswrapper[4626]: I0223 09:23:59.425544 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4kxkp"] Feb 23 09:23:59 crc kubenswrapper[4626]: I0223 09:23:59.434005 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4kxkp"] Feb 23 09:23:59 crc kubenswrapper[4626]: I0223 09:23:59.993216 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" path="/var/lib/kubelet/pods/5df4242c-b640-43a2-bedf-8a6aaae4d38c/volumes" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.813051 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dgh4b"] Feb 23 09:24:01 crc kubenswrapper[4626]: E0223 09:24:01.822461 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="registry-server" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.822517 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="registry-server" Feb 23 09:24:01 crc kubenswrapper[4626]: E0223 09:24:01.822565 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="extract-utilities" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.822576 4626 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="extract-utilities" Feb 23 09:24:01 crc kubenswrapper[4626]: E0223 09:24:01.822609 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="extract-utilities" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.822615 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="extract-utilities" Feb 23 09:24:01 crc kubenswrapper[4626]: E0223 09:24:01.822626 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="registry-server" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.822633 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="registry-server" Feb 23 09:24:01 crc kubenswrapper[4626]: E0223 09:24:01.822658 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="extract-content" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.822667 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="extract-content" Feb 23 09:24:01 crc kubenswrapper[4626]: E0223 09:24:01.822684 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="extract-content" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.822690 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="extract-content" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.823143 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df4242c-b640-43a2-bedf-8a6aaae4d38c" containerName="registry-server" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.823184 4626 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aef61323-8c8b-4f82-ad09-721f3027dea1" containerName="registry-server" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.825116 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dgh4b" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.855473 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgh4b"] Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.963670 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-utilities\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.964213 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbsnm\" (UniqueName: \"kubernetes.io/projected/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-kube-api-access-rbsnm\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b" Feb 23 09:24:01 crc kubenswrapper[4626]: I0223 09:24:01.965140 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-catalog-content\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b" Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.067408 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsnm\" (UniqueName: \"kubernetes.io/projected/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-kube-api-access-rbsnm\") pod \"certified-operators-dgh4b\" 
(UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.067553 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-catalog-content\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.067597 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-utilities\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.068448 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-catalog-content\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.068466 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-utilities\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.096330 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsnm\" (UniqueName: \"kubernetes.io/projected/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-kube-api-access-rbsnm\") pod \"certified-operators-dgh4b\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") " pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.195616 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:02 crc kubenswrapper[4626]: I0223 09:24:02.821998 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dgh4b"]
Feb 23 09:24:03 crc kubenswrapper[4626]: I0223 09:24:03.455404 4626 generic.go:334] "Generic (PLEG): container finished" podID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerID="7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258" exitCode=0
Feb 23 09:24:03 crc kubenswrapper[4626]: I0223 09:24:03.455843 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgh4b" event={"ID":"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2","Type":"ContainerDied","Data":"7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258"}
Feb 23 09:24:03 crc kubenswrapper[4626]: I0223 09:24:03.455885 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgh4b" event={"ID":"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2","Type":"ContainerStarted","Data":"9138ea8cceb933cdaca5c2ef4d2abb573a8ab6f5847b5844de1c60c8601d3bd6"}
Feb 23 09:24:05 crc kubenswrapper[4626]: I0223 09:24:05.481236 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgh4b" event={"ID":"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2","Type":"ContainerStarted","Data":"d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c"}
Feb 23 09:24:06 crc kubenswrapper[4626]: I0223 09:24:06.538769 4626 generic.go:334] "Generic (PLEG): container finished" podID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerID="d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c" exitCode=0
Feb 23 09:24:06 crc kubenswrapper[4626]: I0223 09:24:06.546596 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgh4b" event={"ID":"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2","Type":"ContainerDied","Data":"d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c"}
Feb 23 09:24:07 crc kubenswrapper[4626]: E0223 09:24:07.209195 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice/crio-011a17cf4ba711aa7ec67c87aa19765b1c376598950e81cbbb2cccece1b70dd5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice\": RecentStats: unable to find data in memory cache]"
Feb 23 09:24:07 crc kubenswrapper[4626]: I0223 09:24:07.559747 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgh4b" event={"ID":"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2","Type":"ContainerStarted","Data":"4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed"}
Feb 23 09:24:07 crc kubenswrapper[4626]: I0223 09:24:07.587434 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dgh4b" podStartSLOduration=3.006461127 podStartE2EDuration="6.587408028s" podCreationTimestamp="2026-02-23 09:24:01 +0000 UTC" firstStartedPulling="2026-02-23 09:24:03.458753403 +0000 UTC m=+9795.798082669" lastFinishedPulling="2026-02-23 09:24:07.039700303 +0000 UTC m=+9799.379029570" observedRunningTime="2026-02-23 09:24:07.586519342 +0000 UTC m=+9799.925848608" watchObservedRunningTime="2026-02-23 09:24:07.587408028 +0000 UTC m=+9799.926737294"
Feb 23 09:24:12 crc kubenswrapper[4626]: I0223 09:24:12.196599 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:12 crc kubenswrapper[4626]: I0223 09:24:12.197115 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:13 crc kubenswrapper[4626]: I0223 09:24:13.230312 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dgh4b" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="registry-server" probeResult="failure" output=<
Feb 23 09:24:13 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 09:24:13 crc kubenswrapper[4626]: >
Feb 23 09:24:17 crc kubenswrapper[4626]: E0223 09:24:17.455844 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice/crio-011a17cf4ba711aa7ec67c87aa19765b1c376598950e81cbbb2cccece1b70dd5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice\": RecentStats: unable to find data in memory cache]"
Feb 23 09:24:22 crc kubenswrapper[4626]: I0223 09:24:22.238370 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:22 crc kubenswrapper[4626]: I0223 09:24:22.286028 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:22 crc kubenswrapper[4626]: I0223 09:24:22.476651 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dgh4b"]
Feb 23 09:24:23 crc kubenswrapper[4626]: I0223 09:24:23.716410 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dgh4b" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="registry-server" containerID="cri-o://4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed" gracePeriod=2
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.204056 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.270166 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-utilities\") pod \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") "
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.270230 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbsnm\" (UniqueName: \"kubernetes.io/projected/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-kube-api-access-rbsnm\") pod \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") "
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.270310 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-catalog-content\") pod \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\" (UID: \"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2\") "
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.272925 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-utilities" (OuterVolumeSpecName: "utilities") pod "4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" (UID: "4b33f7c0-649f-4bbf-b525-bab3e8d33fc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.308291 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-kube-api-access-rbsnm" (OuterVolumeSpecName: "kube-api-access-rbsnm") pod "4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" (UID: "4b33f7c0-649f-4bbf-b525-bab3e8d33fc2"). InnerVolumeSpecName "kube-api-access-rbsnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.347653 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" (UID: "4b33f7c0-649f-4bbf-b525-bab3e8d33fc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.375296 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.375331 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbsnm\" (UniqueName: \"kubernetes.io/projected/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-kube-api-access-rbsnm\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.375344 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.727860 4626 generic.go:334] "Generic (PLEG): container finished" podID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerID="4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed" exitCode=0
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.727914 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgh4b" event={"ID":"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2","Type":"ContainerDied","Data":"4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed"}
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.727999 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dgh4b" event={"ID":"4b33f7c0-649f-4bbf-b525-bab3e8d33fc2","Type":"ContainerDied","Data":"9138ea8cceb933cdaca5c2ef4d2abb573a8ab6f5847b5844de1c60c8601d3bd6"}
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.727998 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dgh4b"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.728042 4626 scope.go:117] "RemoveContainer" containerID="4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.775491 4626 scope.go:117] "RemoveContainer" containerID="d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.779272 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dgh4b"]
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.807870 4626 scope.go:117] "RemoveContainer" containerID="7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.807961 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dgh4b"]
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.855787 4626 scope.go:117] "RemoveContainer" containerID="4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed"
Feb 23 09:24:24 crc kubenswrapper[4626]: E0223 09:24:24.860919 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed\": container with ID starting with 4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed not found: ID does not exist" containerID="4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.861098 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed"} err="failed to get container status \"4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed\": rpc error: code = NotFound desc = could not find container \"4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed\": container with ID starting with 4dd24e98cb5f02a41614b7178affb8f430517a5a3d2cbd46f7ae7069e16c6bed not found: ID does not exist"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.861195 4626 scope.go:117] "RemoveContainer" containerID="d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c"
Feb 23 09:24:24 crc kubenswrapper[4626]: E0223 09:24:24.863026 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c\": container with ID starting with d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c not found: ID does not exist" containerID="d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.863146 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c"} err="failed to get container status \"d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c\": rpc error: code = NotFound desc = could not find container \"d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c\": container with ID starting with d24f93ee4efd654c67b15001790a9d08179ea8cb66dda40845bdab3f4c664c7c not found: ID does not exist"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.863222 4626 scope.go:117] "RemoveContainer" containerID="7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258"
Feb 23 09:24:24 crc kubenswrapper[4626]: E0223 09:24:24.863853 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258\": container with ID starting with 7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258 not found: ID does not exist" containerID="7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258"
Feb 23 09:24:24 crc kubenswrapper[4626]: I0223 09:24:24.863917 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258"} err="failed to get container status \"7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258\": rpc error: code = NotFound desc = could not find container \"7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258\": container with ID starting with 7366d8b78019977657b8b6fd908c207b239da52913c40a6de08d38879a59e258 not found: ID does not exist"
Feb 23 09:24:25 crc kubenswrapper[4626]: I0223 09:24:25.685328 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:24:25 crc kubenswrapper[4626]: I0223 09:24:25.685388 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:24:25 crc kubenswrapper[4626]: I0223 09:24:25.685436 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 09:24:25 crc kubenswrapper[4626]: I0223 09:24:25.685959 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e2319dd85ff58f417e9b19da900255d78470cf0b6837778a9207fb2e2656ffd"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 09:24:25 crc kubenswrapper[4626]: I0223 09:24:25.686011 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://0e2319dd85ff58f417e9b19da900255d78470cf0b6837778a9207fb2e2656ffd" gracePeriod=600
Feb 23 09:24:25 crc kubenswrapper[4626]: I0223 09:24:25.991399 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" path="/var/lib/kubelet/pods/4b33f7c0-649f-4bbf-b525-bab3e8d33fc2/volumes"
Feb 23 09:24:26 crc kubenswrapper[4626]: I0223 09:24:26.751392 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="0e2319dd85ff58f417e9b19da900255d78470cf0b6837778a9207fb2e2656ffd" exitCode=0
Feb 23 09:24:26 crc kubenswrapper[4626]: I0223 09:24:26.751451 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"0e2319dd85ff58f417e9b19da900255d78470cf0b6837778a9207fb2e2656ffd"}
Feb 23 09:24:26 crc kubenswrapper[4626]: I0223 09:24:26.751854 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"}
Feb 23 09:24:26 crc kubenswrapper[4626]: I0223 09:24:26.751903 4626 scope.go:117] "RemoveContainer" containerID="88670475e1778e966f9138e608d8af17c67a6f384e8f15e9ffbe7d711aa1d185"
Feb 23 09:24:27 crc kubenswrapper[4626]: E0223 09:24:27.716462 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice/crio-011a17cf4ba711aa7ec67c87aa19765b1c376598950e81cbbb2cccece1b70dd5\": RecentStats: unable to find data in memory cache]"
Feb 23 09:24:36 crc kubenswrapper[4626]: I0223 09:24:36.841163 4626 generic.go:334] "Generic (PLEG): container finished" podID="b512a4b3-6fbb-46db-a379-f42c1cef7bee" containerID="24f55ce3e1acda781f196e9163e82f87d89bef681187e82302878fa911662276" exitCode=0
Feb 23 09:24:36 crc kubenswrapper[4626]: I0223 09:24:36.841332 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf" event={"ID":"b512a4b3-6fbb-46db-a379-f42c1cef7bee","Type":"ContainerDied","Data":"24f55ce3e1acda781f196e9163e82f87d89bef681187e82302878fa911662276"}
Feb 23 09:24:37 crc kubenswrapper[4626]: I0223 09:24:37.945386 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf"
Feb 23 09:24:37 crc kubenswrapper[4626]: E0223 09:24:37.952656 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef61323_8c8b_4f82_ad09_721f3027dea1.slice/crio-011a17cf4ba711aa7ec67c87aa19765b1c376598950e81cbbb2cccece1b70dd5\": RecentStats: unable to find data in memory cache]"
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.028330 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-v2sbf"]
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.035091 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-v2sbf"]
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.117371 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b512a4b3-6fbb-46db-a379-f42c1cef7bee-host\") pod \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\" (UID: \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") "
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.117594 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/b512a4b3-6fbb-46db-a379-f42c1cef7bee-kube-api-access-6gpt5\") pod \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\" (UID: \"b512a4b3-6fbb-46db-a379-f42c1cef7bee\") "
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.118124 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b512a4b3-6fbb-46db-a379-f42c1cef7bee-host" (OuterVolumeSpecName: "host") pod "b512a4b3-6fbb-46db-a379-f42c1cef7bee" (UID: "b512a4b3-6fbb-46db-a379-f42c1cef7bee"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.118819 4626 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b512a4b3-6fbb-46db-a379-f42c1cef7bee-host\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.135780 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b512a4b3-6fbb-46db-a379-f42c1cef7bee-kube-api-access-6gpt5" (OuterVolumeSpecName: "kube-api-access-6gpt5") pod "b512a4b3-6fbb-46db-a379-f42c1cef7bee" (UID: "b512a4b3-6fbb-46db-a379-f42c1cef7bee"). InnerVolumeSpecName "kube-api-access-6gpt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.222092 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/b512a4b3-6fbb-46db-a379-f42c1cef7bee-kube-api-access-6gpt5\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.872310 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42dd6260065eb73947bc2946d55e64cfa013aaabd32c241c72a04c792ff8edd6"
Feb 23 09:24:38 crc kubenswrapper[4626]: I0223 09:24:38.873161 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-v2sbf"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.183268 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-w6xdd"]
Feb 23 09:24:39 crc kubenswrapper[4626]: E0223 09:24:39.184851 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="extract-content"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.184960 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="extract-content"
Feb 23 09:24:39 crc kubenswrapper[4626]: E0223 09:24:39.185062 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="extract-utilities"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.185130 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="extract-utilities"
Feb 23 09:24:39 crc kubenswrapper[4626]: E0223 09:24:39.185192 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b512a4b3-6fbb-46db-a379-f42c1cef7bee" containerName="container-00"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.185262 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="b512a4b3-6fbb-46db-a379-f42c1cef7bee" containerName="container-00"
Feb 23 09:24:39 crc kubenswrapper[4626]: E0223 09:24:39.185298 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="registry-server"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.185305 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="registry-server"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.185708 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="b512a4b3-6fbb-46db-a379-f42c1cef7bee" containerName="container-00"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.185797 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b33f7c0-649f-4bbf-b525-bab3e8d33fc2" containerName="registry-server"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.186554 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.347866 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0ed597d-a294-4e24-86de-6a4720dc46a4-host\") pod \"crc-debug-w6xdd\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") " pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.348691 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87g78\" (UniqueName: \"kubernetes.io/projected/d0ed597d-a294-4e24-86de-6a4720dc46a4-kube-api-access-87g78\") pod \"crc-debug-w6xdd\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") " pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.450142 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0ed597d-a294-4e24-86de-6a4720dc46a4-host\") pod \"crc-debug-w6xdd\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") " pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.450328 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87g78\" (UniqueName: \"kubernetes.io/projected/d0ed597d-a294-4e24-86de-6a4720dc46a4-kube-api-access-87g78\") pod \"crc-debug-w6xdd\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") " pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.450526 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0ed597d-a294-4e24-86de-6a4720dc46a4-host\") pod \"crc-debug-w6xdd\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") " pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.694039 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87g78\" (UniqueName: \"kubernetes.io/projected/d0ed597d-a294-4e24-86de-6a4720dc46a4-kube-api-access-87g78\") pod \"crc-debug-w6xdd\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") " pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.803725 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.881672 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-w6xdd" event={"ID":"d0ed597d-a294-4e24-86de-6a4720dc46a4","Type":"ContainerStarted","Data":"81106c77e79bf39a0006e82dbaae655aaf39f0f75041681fc1c54e4486a1a5c4"}
Feb 23 09:24:39 crc kubenswrapper[4626]: I0223 09:24:39.992638 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b512a4b3-6fbb-46db-a379-f42c1cef7bee" path="/var/lib/kubelet/pods/b512a4b3-6fbb-46db-a379-f42c1cef7bee/volumes"
Feb 23 09:24:40 crc kubenswrapper[4626]: I0223 09:24:40.896264 4626 generic.go:334] "Generic (PLEG): container finished" podID="d0ed597d-a294-4e24-86de-6a4720dc46a4" containerID="ee57486ae36ab510e7e4f0930c1178347a7db1a0254fb52905789958f8fee157" exitCode=0
Feb 23 09:24:40 crc kubenswrapper[4626]: I0223 09:24:40.896329 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-w6xdd" event={"ID":"d0ed597d-a294-4e24-86de-6a4720dc46a4","Type":"ContainerDied","Data":"ee57486ae36ab510e7e4f0930c1178347a7db1a0254fb52905789958f8fee157"}
Feb 23 09:24:41 crc kubenswrapper[4626]: I0223 09:24:41.990949 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.111693 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87g78\" (UniqueName: \"kubernetes.io/projected/d0ed597d-a294-4e24-86de-6a4720dc46a4-kube-api-access-87g78\") pod \"d0ed597d-a294-4e24-86de-6a4720dc46a4\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") "
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.111867 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0ed597d-a294-4e24-86de-6a4720dc46a4-host\") pod \"d0ed597d-a294-4e24-86de-6a4720dc46a4\" (UID: \"d0ed597d-a294-4e24-86de-6a4720dc46a4\") "
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.115945 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0ed597d-a294-4e24-86de-6a4720dc46a4-host" (OuterVolumeSpecName: "host") pod "d0ed597d-a294-4e24-86de-6a4720dc46a4" (UID: "d0ed597d-a294-4e24-86de-6a4720dc46a4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.134855 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ed597d-a294-4e24-86de-6a4720dc46a4-kube-api-access-87g78" (OuterVolumeSpecName: "kube-api-access-87g78") pod "d0ed597d-a294-4e24-86de-6a4720dc46a4" (UID: "d0ed597d-a294-4e24-86de-6a4720dc46a4"). InnerVolumeSpecName "kube-api-access-87g78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.216100 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87g78\" (UniqueName: \"kubernetes.io/projected/d0ed597d-a294-4e24-86de-6a4720dc46a4-kube-api-access-87g78\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.216143 4626 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0ed597d-a294-4e24-86de-6a4720dc46a4-host\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.927886 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-w6xdd" event={"ID":"d0ed597d-a294-4e24-86de-6a4720dc46a4","Type":"ContainerDied","Data":"81106c77e79bf39a0006e82dbaae655aaf39f0f75041681fc1c54e4486a1a5c4"}
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.927937 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81106c77e79bf39a0006e82dbaae655aaf39f0f75041681fc1c54e4486a1a5c4"
Feb 23 09:24:42 crc kubenswrapper[4626]: I0223 09:24:42.928012 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-w6xdd"
Feb 23 09:24:43 crc kubenswrapper[4626]: I0223 09:24:43.214509 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-w6xdd"]
Feb 23 09:24:43 crc kubenswrapper[4626]: I0223 09:24:43.224871 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-w6xdd"]
Feb 23 09:24:43 crc kubenswrapper[4626]: I0223 09:24:43.993618 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ed597d-a294-4e24-86de-6a4720dc46a4" path="/var/lib/kubelet/pods/d0ed597d-a294-4e24-86de-6a4720dc46a4/volumes"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.402964 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-n8sbn"]
Feb 23 09:24:44 crc kubenswrapper[4626]: E0223 09:24:44.403642 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ed597d-a294-4e24-86de-6a4720dc46a4" containerName="container-00"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.403660 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ed597d-a294-4e24-86de-6a4720dc46a4" containerName="container-00"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.404121 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ed597d-a294-4e24-86de-6a4720dc46a4" containerName="container-00"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.405554 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.571763 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08200018-1237-46b0-b1c2-2b3910b6fe44-host\") pod \"crc-debug-n8sbn\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.571850 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ckxj\" (UniqueName: \"kubernetes.io/projected/08200018-1237-46b0-b1c2-2b3910b6fe44-kube-api-access-4ckxj\") pod \"crc-debug-n8sbn\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.675600 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08200018-1237-46b0-b1c2-2b3910b6fe44-host\") pod \"crc-debug-n8sbn\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.675700 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ckxj\" (UniqueName: \"kubernetes.io/projected/08200018-1237-46b0-b1c2-2b3910b6fe44-kube-api-access-4ckxj\") pod \"crc-debug-n8sbn\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.675757 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08200018-1237-46b0-b1c2-2b3910b6fe44-host\") pod \"crc-debug-n8sbn\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.704167 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ckxj\" (UniqueName: \"kubernetes.io/projected/08200018-1237-46b0-b1c2-2b3910b6fe44-kube-api-access-4ckxj\") pod \"crc-debug-n8sbn\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.723583 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn"
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.964167 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn" event={"ID":"08200018-1237-46b0-b1c2-2b3910b6fe44","Type":"ContainerStarted","Data":"d17906e70fdf0df64dc9cd08b4596ab51c44dbfcb3f4525bc4d1ac9c7638c3ae"}
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.964407 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn" event={"ID":"08200018-1237-46b0-b1c2-2b3910b6fe44","Type":"ContainerStarted","Data":"8036e79ab342820be358b520550638d56817a96e6844793a40c79bfbb1ee1649"}
Feb 23 09:24:44 crc kubenswrapper[4626]: I0223 09:24:44.978865 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn" podStartSLOduration=0.978851731 podStartE2EDuration="978.851731ms" podCreationTimestamp="2026-02-23 09:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:24:44.977946274 +0000 UTC m=+9837.317275540" watchObservedRunningTime="2026-02-23 09:24:44.978851731 +0000 UTC m=+9837.318180997"
Feb 23 09:24:45 crc kubenswrapper[4626]: I0223 09:24:45.973428 4626 generic.go:334] "Generic (PLEG): container finished" podID="08200018-1237-46b0-b1c2-2b3910b6fe44"
containerID="d17906e70fdf0df64dc9cd08b4596ab51c44dbfcb3f4525bc4d1ac9c7638c3ae" exitCode=0 Feb 23 09:24:45 crc kubenswrapper[4626]: I0223 09:24:45.973534 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn" event={"ID":"08200018-1237-46b0-b1c2-2b3910b6fe44","Type":"ContainerDied","Data":"d17906e70fdf0df64dc9cd08b4596ab51c44dbfcb3f4525bc4d1ac9c7638c3ae"} Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.076440 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn" Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.104648 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-n8sbn"] Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.110438 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vhk9d/crc-debug-n8sbn"] Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.246101 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08200018-1237-46b0-b1c2-2b3910b6fe44-host\") pod \"08200018-1237-46b0-b1c2-2b3910b6fe44\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.246162 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ckxj\" (UniqueName: \"kubernetes.io/projected/08200018-1237-46b0-b1c2-2b3910b6fe44-kube-api-access-4ckxj\") pod \"08200018-1237-46b0-b1c2-2b3910b6fe44\" (UID: \"08200018-1237-46b0-b1c2-2b3910b6fe44\") " Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.246394 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08200018-1237-46b0-b1c2-2b3910b6fe44-host" (OuterVolumeSpecName: "host") pod "08200018-1237-46b0-b1c2-2b3910b6fe44" (UID: "08200018-1237-46b0-b1c2-2b3910b6fe44"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.246950 4626 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08200018-1237-46b0-b1c2-2b3910b6fe44-host\") on node \"crc\" DevicePath \"\"" Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.258929 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08200018-1237-46b0-b1c2-2b3910b6fe44-kube-api-access-4ckxj" (OuterVolumeSpecName: "kube-api-access-4ckxj") pod "08200018-1237-46b0-b1c2-2b3910b6fe44" (UID: "08200018-1237-46b0-b1c2-2b3910b6fe44"). InnerVolumeSpecName "kube-api-access-4ckxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.348800 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ckxj\" (UniqueName: \"kubernetes.io/projected/08200018-1237-46b0-b1c2-2b3910b6fe44-kube-api-access-4ckxj\") on node \"crc\" DevicePath \"\"" Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.991402 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08200018-1237-46b0-b1c2-2b3910b6fe44" path="/var/lib/kubelet/pods/08200018-1237-46b0-b1c2-2b3910b6fe44/volumes" Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.993129 4626 scope.go:117] "RemoveContainer" containerID="d17906e70fdf0df64dc9cd08b4596ab51c44dbfcb3f4525bc4d1ac9c7638c3ae" Feb 23 09:24:47 crc kubenswrapper[4626]: I0223 09:24:47.993174 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vhk9d/crc-debug-n8sbn" Feb 23 09:24:48 crc kubenswrapper[4626]: E0223 09:24:48.222490 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08200018_1237_46b0_b1c2_2b3910b6fe44.slice/crio-8036e79ab342820be358b520550638d56817a96e6844793a40c79bfbb1ee1649\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08200018_1237_46b0_b1c2_2b3910b6fe44.slice\": RecentStats: unable to find data in memory cache]" Feb 23 09:25:06 crc kubenswrapper[4626]: I0223 09:25:06.306515 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6958fdb966-vkk9n_e591795d-67ce-48d5-a54e-2f989878eca9/barbican-api/0.log" Feb 23 09:25:06 crc kubenswrapper[4626]: I0223 09:25:06.446325 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6958fdb966-vkk9n_e591795d-67ce-48d5-a54e-2f989878eca9/barbican-api-log/0.log" Feb 23 09:25:06 crc kubenswrapper[4626]: I0223 09:25:06.540659 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6495996568-xfqgf_3bd22d53-a38c-4579-b6fd-e7934e32ca47/barbican-keystone-listener/0.log" Feb 23 09:25:06 crc kubenswrapper[4626]: I0223 09:25:06.715595 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6495996568-xfqgf_3bd22d53-a38c-4579-b6fd-e7934e32ca47/barbican-keystone-listener-log/0.log" Feb 23 09:25:06 crc kubenswrapper[4626]: I0223 09:25:06.832142 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7ccd97cd69-bpkw8_6ec78427-155b-4ed6-8d16-e56f099473c1/barbican-worker-log/0.log" Feb 23 09:25:06 crc kubenswrapper[4626]: I0223 09:25:06.836335 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-7ccd97cd69-bpkw8_6ec78427-155b-4ed6-8d16-e56f099473c1/barbican-worker/0.log" Feb 23 09:25:06 crc kubenswrapper[4626]: I0223 09:25:06.994672 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6_5d5da7da-f1d4-4a24-9a9b-e22d85625761/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.110680 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/ceilometer-central-agent/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.247026 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/proxy-httpd/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.274722 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/ceilometer-notification-agent/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.335046 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/sg-core/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.514599 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5f9731f-2161-4757-97a7-e542f744362c/cinder-api-log/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.627422 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5f9731f-2161-4757-97a7-e542f744362c/cinder-api/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.747042 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8/cinder-scheduler/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.852730 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8/probe/0.log" Feb 23 09:25:07 crc kubenswrapper[4626]: I0223 09:25:07.974309 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn_17b6f47d-57a1-46e9-be66-1f93b98664c3/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:08 crc kubenswrapper[4626]: I0223 09:25:08.131709 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zckcd_1ea04624-3b44-4b2b-b89d-7799440e264f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:08 crc kubenswrapper[4626]: I0223 09:25:08.260216 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7cdb55cb5c-2pfm4_5f9703d1-1761-47e8-8524-c52def1bcac3/init/0.log" Feb 23 09:25:08 crc kubenswrapper[4626]: I0223 09:25:08.461415 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7cdb55cb5c-2pfm4_5f9703d1-1761-47e8-8524-c52def1bcac3/init/0.log" Feb 23 09:25:08 crc kubenswrapper[4626]: I0223 09:25:08.535759 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k_b968ab81-8b5f-49c7-830b-220b90d6b1f1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:08 crc kubenswrapper[4626]: I0223 09:25:08.686626 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7cdb55cb5c-2pfm4_5f9703d1-1761-47e8-8524-c52def1bcac3/dnsmasq-dns/0.log" Feb 23 09:25:08 crc kubenswrapper[4626]: I0223 09:25:08.852069 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2fc61e5-419c-4dab-9ddb-52bb9de855d5/glance-httpd/0.log" Feb 23 09:25:08 crc kubenswrapper[4626]: I0223 09:25:08.889767 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_e2fc61e5-419c-4dab-9ddb-52bb9de855d5/glance-log/0.log" Feb 23 09:25:09 crc kubenswrapper[4626]: I0223 09:25:09.054975 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7afc12b1-f684-47fc-bb2f-201f09707ad6/glance-httpd/0.log" Feb 23 09:25:09 crc kubenswrapper[4626]: I0223 09:25:09.122913 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7afc12b1-f684-47fc-bb2f-201f09707ad6/glance-log/0.log" Feb 23 09:25:10 crc kubenswrapper[4626]: I0223 09:25:10.189991 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5d7b45f997-g8dhd_e53266a3-8d3d-44af-b0f7-c48a7170ceac/heat-engine/0.log" Feb 23 09:25:10 crc kubenswrapper[4626]: I0223 09:25:10.481760 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-688bccf86-4crkw_d3e1e535-58de-4987-9d93-65fb6d4c9409/horizon/0.log" Feb 23 09:25:11 crc kubenswrapper[4626]: I0223 09:25:11.221488 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7f6ddcf74f-chh24_05f9d0eb-fafc-496f-9fe2-8923f9d8db61/heat-cfnapi/0.log" Feb 23 09:25:11 crc kubenswrapper[4626]: I0223 09:25:11.249104 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-54768cd758-d6bbc_bdb47037-af0c-4d21-9e61-53b65fb113d1/heat-api/0.log" Feb 23 09:25:11 crc kubenswrapper[4626]: I0223 09:25:11.286621 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8_3d1fa218-95df-487b-b4d0-be0da8e72c58/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:11 crc kubenswrapper[4626]: I0223 09:25:11.531992 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ch9l8_f5ac8c56-2109-41c7-8129-5561016dbaef/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:11 crc kubenswrapper[4626]: I0223 09:25:11.844615 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530501-tbhcl_4b61fdde-2209-4bd7-b3c1-a8f4123825a1/keystone-cron/0.log" Feb 23 09:25:12 crc kubenswrapper[4626]: I0223 09:25:12.143715 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-688bccf86-4crkw_d3e1e535-58de-4987-9d93-65fb6d4c9409/horizon-log/0.log" Feb 23 09:25:12 crc kubenswrapper[4626]: I0223 09:25:12.229392 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530561-fjctr_82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b/keystone-cron/0.log" Feb 23 09:25:12 crc kubenswrapper[4626]: I0223 09:25:12.340591 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530621-vb4gm_c637caea-745d-4de8-9111-addf155f30c3/keystone-cron/0.log" Feb 23 09:25:12 crc kubenswrapper[4626]: I0223 09:25:12.520428 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6b6610d6-48cf-4f86-ac4d-603b4bb60f04/kube-state-metrics/0.log" Feb 23 09:25:12 crc kubenswrapper[4626]: I0223 09:25:12.683576 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs_a9f59db3-8e35-432d-9dc1-bf70b5de9990/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:12 crc kubenswrapper[4626]: I0223 09:25:12.804066 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6ffd4ff45f-xttfr_9f0e3a3c-7106-4f7e-af92-a329a82fc625/keystone-api/0.log" Feb 23 09:25:13 crc kubenswrapper[4626]: I0223 09:25:13.234827 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-647d8d576f-jnrsm_f3b2c5c3-7e78-46b9-8365-396752a27b88/neutron-httpd/0.log" 
Feb 23 09:25:13 crc kubenswrapper[4626]: I0223 09:25:13.408299 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb_671613e8-e8c1-40e7-86bf-026acd3864fe/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:14 crc kubenswrapper[4626]: I0223 09:25:14.431949 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-647d8d576f-jnrsm_f3b2c5c3-7e78-46b9-8365-396752a27b88/neutron-api/0.log" Feb 23 09:25:14 crc kubenswrapper[4626]: I0223 09:25:14.688804 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e6972135-5165-4e01-9a21-591d7c07c533/nova-cell0-conductor-conductor/0.log" Feb 23 09:25:15 crc kubenswrapper[4626]: I0223 09:25:15.171221 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a8470cfd-1a2b-4e2b-b59a-10f5de602156/nova-cell1-conductor-conductor/0.log" Feb 23 09:25:15 crc kubenswrapper[4626]: I0223 09:25:15.653035 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9/nova-cell1-novncproxy-novncproxy/0.log" Feb 23 09:25:15 crc kubenswrapper[4626]: I0223 09:25:15.846109 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-7wfjm_ba805f29-0d45-499e-bc08-00188c51379f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:16 crc kubenswrapper[4626]: I0223 09:25:16.293101 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f00ecb25-d721-43a6-810e-976c03a0572d/nova-metadata-log/0.log" Feb 23 09:25:16 crc kubenswrapper[4626]: I0223 09:25:16.921049 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e3af6bff-9179-4a10-aa63-4227c8933818/nova-api-log/0.log" Feb 23 09:25:17 crc kubenswrapper[4626]: I0223 09:25:17.598102 4626 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03a2b658-d642-449c-be58-94b80484618e/mysql-bootstrap/0.log" Feb 23 09:25:17 crc kubenswrapper[4626]: I0223 09:25:17.798470 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03a2b658-d642-449c-be58-94b80484618e/mysql-bootstrap/0.log" Feb 23 09:25:17 crc kubenswrapper[4626]: I0223 09:25:17.844642 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_8dcf4a83-5fb6-41df-bf75-af6298f822e1/nova-scheduler-scheduler/0.log" Feb 23 09:25:18 crc kubenswrapper[4626]: I0223 09:25:18.096302 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03a2b658-d642-449c-be58-94b80484618e/galera/0.log" Feb 23 09:25:18 crc kubenswrapper[4626]: I0223 09:25:18.434520 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3b2ea075-8063-4feb-8e91-3160073129ff/mysql-bootstrap/0.log" Feb 23 09:25:18 crc kubenswrapper[4626]: I0223 09:25:18.652448 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3b2ea075-8063-4feb-8e91-3160073129ff/mysql-bootstrap/0.log" Feb 23 09:25:18 crc kubenswrapper[4626]: I0223 09:25:18.678032 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e3af6bff-9179-4a10-aa63-4227c8933818/nova-api-api/0.log" Feb 23 09:25:18 crc kubenswrapper[4626]: I0223 09:25:18.709622 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3b2ea075-8063-4feb-8e91-3160073129ff/galera/0.log" Feb 23 09:25:18 crc kubenswrapper[4626]: I0223 09:25:18.894640 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_36431530-ec45-4670-bc2c-ababbf867d6f/openstackclient/0.log" Feb 23 09:25:19 crc kubenswrapper[4626]: I0223 09:25:19.163956 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-9j9vm_61db3f96-4a68-44bd-82ff-076ba32d9066/ovn-controller/0.log" Feb 23 09:25:19 crc kubenswrapper[4626]: I0223 09:25:19.193100 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k59cf_707342b7-0d1b-431f-98bb-99af693f57b2/openstack-network-exporter/0.log" Feb 23 09:25:19 crc kubenswrapper[4626]: I0223 09:25:19.511363 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovsdb-server-init/0.log" Feb 23 09:25:19 crc kubenswrapper[4626]: I0223 09:25:19.825003 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovsdb-server-init/0.log" Feb 23 09:25:19 crc kubenswrapper[4626]: I0223 09:25:19.826246 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovs-vswitchd/0.log" Feb 23 09:25:19 crc kubenswrapper[4626]: I0223 09:25:19.889280 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovsdb-server/0.log" Feb 23 09:25:20 crc kubenswrapper[4626]: I0223 09:25:20.502316 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-szt5h_97c11447-7070-4233-aaf2-7661d687049d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:20 crc kubenswrapper[4626]: I0223 09:25:20.711868 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9668927a-b529-4f44-a093-41260f069e34/openstack-network-exporter/0.log" Feb 23 09:25:20 crc kubenswrapper[4626]: I0223 09:25:20.749263 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9668927a-b529-4f44-a093-41260f069e34/ovn-northd/0.log" Feb 23 09:25:20 crc kubenswrapper[4626]: I0223 09:25:20.981617 4626 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6b918ead-8c70-463f-b938-948436aa4278/openstack-network-exporter/0.log" Feb 23 09:25:21 crc kubenswrapper[4626]: I0223 09:25:21.142357 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6b918ead-8c70-463f-b938-948436aa4278/ovsdbserver-nb/0.log" Feb 23 09:25:21 crc kubenswrapper[4626]: I0223 09:25:21.273937 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f76a96be-520f-46e8-9e47-4a4d3237359e/openstack-network-exporter/0.log" Feb 23 09:25:21 crc kubenswrapper[4626]: I0223 09:25:21.386097 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f00ecb25-d721-43a6-810e-976c03a0572d/nova-metadata-metadata/0.log" Feb 23 09:25:21 crc kubenswrapper[4626]: I0223 09:25:21.501522 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_38df29c3-b467-498f-9ec1-a83cc91c27ca/memcached/0.log" Feb 23 09:25:21 crc kubenswrapper[4626]: I0223 09:25:21.531434 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f76a96be-520f-46e8-9e47-4a4d3237359e/ovsdbserver-sb/0.log" Feb 23 09:25:21 crc kubenswrapper[4626]: I0223 09:25:21.947694 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6cb97bcbf6-sl6hx_296f373f-42ac-474f-bc36-eab630843ed1/placement-api/0.log" Feb 23 09:25:21 crc kubenswrapper[4626]: I0223 09:25:21.998730 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3d9572dd-1a1c-4261-a7ba-7538d24d769a/setup-container/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.153661 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6cb97bcbf6-sl6hx_296f373f-42ac-474f-bc36-eab630843ed1/placement-log/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.168208 4626 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3d9572dd-1a1c-4261-a7ba-7538d24d769a/setup-container/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.225230 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3d9572dd-1a1c-4261-a7ba-7538d24d769a/rabbitmq/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.303985 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_969fc15c-8289-4aa6-b590-9fa59f05783b/setup-container/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.546815 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_969fc15c-8289-4aa6-b590-9fa59f05783b/rabbitmq/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.590216 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk_80dffd5f-db5e-4946-9efe-8137bf36671f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.594663 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_969fc15c-8289-4aa6-b590-9fa59f05783b/setup-container/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.797350 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-cb2zm_1d465e63-5644-4732-a661-1134ffa03a78/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.903826 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-c782w_5bd897ec-9ff1-4dc4-87c5-db910e8593e4/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:22 crc kubenswrapper[4626]: I0223 09:25:22.996884 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pxhls_8e11e133-311c-4bd4-9989-a0e05f665f6a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.147932 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-swrn8_28fc407d-b22d-432c-b5c1-7fbb18142e65/ssh-known-hosts-edpm-deployment/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.436733 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6465458495-hgsdz_434e199c-4e18-4274-bbaa-f81f2e2a697b/proxy-server/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.537608 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dkrrn_0f4f8444-d09b-4213-be5c-585c699d29ae/swift-ring-rebalance/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.557567 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6465458495-hgsdz_434e199c-4e18-4274-bbaa-f81f2e2a697b/proxy-httpd/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.718927 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-auditor/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.765846 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-reaper/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.816991 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-replicator/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 09:25:23.915349 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-server/0.log" Feb 23 09:25:23 crc kubenswrapper[4626]: I0223 
09:25:23.936455 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-auditor/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.093299 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-replicator/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.140979 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-server/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.187282 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-updater/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.265859 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-expirer/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.296369 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-auditor/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.412600 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-replicator/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.441304 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-server/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.529804 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-updater/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.557871 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/swift-recon-cron/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.589093 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/rsync/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.795837 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr_a8761a5a-7aba-46e2-9070-49cc7e866c7b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:24 crc kubenswrapper[4626]: I0223 09:25:24.839915 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-multi-thread-testing_71976b26-a20d-4173-98a9-e4d5b553fb8b/tempest-tests-tempest-tests-runner/0.log" Feb 23 09:25:25 crc kubenswrapper[4626]: I0223 09:25:25.002788 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-thread-testing_4774892a-4776-4db1-b74e-d78df11aa97e/tempest-tests-tempest-tests-runner/0.log" Feb 23 09:25:25 crc kubenswrapper[4626]: I0223 09:25:25.511729 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d0209d94-e31f-4614-aa4d-e48ed971fcff/test-operator-logs-container/0.log" Feb 23 09:25:25 crc kubenswrapper[4626]: I0223 09:25:25.533809 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4_8d4a8ee4-c271-4fe2-b9ea-85a1a176a000/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.492379 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7nhbm"] Feb 23 09:25:49 crc kubenswrapper[4626]: E0223 09:25:49.493253 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="08200018-1237-46b0-b1c2-2b3910b6fe44" containerName="container-00" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.493270 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="08200018-1237-46b0-b1c2-2b3910b6fe44" containerName="container-00" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.493531 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="08200018-1237-46b0-b1c2-2b3910b6fe44" containerName="container-00" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.496489 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.510442 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nhbm"] Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.532653 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkm7r\" (UniqueName: \"kubernetes.io/projected/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-kube-api-access-fkm7r\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.532796 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-utilities\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.532832 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-catalog-content\") pod \"redhat-marketplace-7nhbm\" (UID: 
\"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.634630 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-utilities\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.634692 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-catalog-content\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.634774 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkm7r\" (UniqueName: \"kubernetes.io/projected/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-kube-api-access-fkm7r\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.635125 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-utilities\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.635423 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-catalog-content\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " 
pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.666254 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkm7r\" (UniqueName: \"kubernetes.io/projected/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-kube-api-access-fkm7r\") pod \"redhat-marketplace-7nhbm\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:49 crc kubenswrapper[4626]: I0223 09:25:49.822302 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:50 crc kubenswrapper[4626]: I0223 09:25:50.310594 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nhbm"] Feb 23 09:25:50 crc kubenswrapper[4626]: I0223 09:25:50.615603 4626 generic.go:334] "Generic (PLEG): container finished" podID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerID="839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5" exitCode=0 Feb 23 09:25:50 crc kubenswrapper[4626]: I0223 09:25:50.615698 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nhbm" event={"ID":"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971","Type":"ContainerDied","Data":"839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5"} Feb 23 09:25:50 crc kubenswrapper[4626]: I0223 09:25:50.616032 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nhbm" event={"ID":"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971","Type":"ContainerStarted","Data":"1bbac1ec933dbdb2cf9ddcfed9eefc6cb8d2f618b0f54087421efb8bf809305f"} Feb 23 09:25:51 crc kubenswrapper[4626]: I0223 09:25:51.628038 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nhbm" 
event={"ID":"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971","Type":"ContainerStarted","Data":"ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544"} Feb 23 09:25:52 crc kubenswrapper[4626]: I0223 09:25:52.639380 4626 generic.go:334] "Generic (PLEG): container finished" podID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerID="ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544" exitCode=0 Feb 23 09:25:52 crc kubenswrapper[4626]: I0223 09:25:52.639554 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nhbm" event={"ID":"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971","Type":"ContainerDied","Data":"ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544"} Feb 23 09:25:53 crc kubenswrapper[4626]: I0223 09:25:53.652023 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nhbm" event={"ID":"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971","Type":"ContainerStarted","Data":"8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615"} Feb 23 09:25:53 crc kubenswrapper[4626]: I0223 09:25:53.675691 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7nhbm" podStartSLOduration=2.119365803 podStartE2EDuration="4.67567063s" podCreationTimestamp="2026-02-23 09:25:49 +0000 UTC" firstStartedPulling="2026-02-23 09:25:50.617438967 +0000 UTC m=+9902.956768233" lastFinishedPulling="2026-02-23 09:25:53.173743794 +0000 UTC m=+9905.513073060" observedRunningTime="2026-02-23 09:25:53.669190013 +0000 UTC m=+9906.008519279" watchObservedRunningTime="2026-02-23 09:25:53.67567063 +0000 UTC m=+9906.014999897" Feb 23 09:25:57 crc kubenswrapper[4626]: I0223 09:25:57.132117 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/util/0.log" Feb 23 09:25:57 crc kubenswrapper[4626]: I0223 
09:25:57.347454 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/util/0.log" Feb 23 09:25:57 crc kubenswrapper[4626]: I0223 09:25:57.377097 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/pull/0.log" Feb 23 09:25:57 crc kubenswrapper[4626]: I0223 09:25:57.414765 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/pull/0.log" Feb 23 09:25:57 crc kubenswrapper[4626]: I0223 09:25:57.537177 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/util/0.log" Feb 23 09:25:57 crc kubenswrapper[4626]: I0223 09:25:57.567878 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/extract/0.log" Feb 23 09:25:57 crc kubenswrapper[4626]: I0223 09:25:57.607486 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/pull/0.log" Feb 23 09:25:58 crc kubenswrapper[4626]: I0223 09:25:58.425332 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-ljvjt_d2c9303d-be86-451a-8834-67abc679952b/manager/0.log" Feb 23 09:25:58 crc kubenswrapper[4626]: I0223 09:25:58.844278 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-6h2w7_805fd015-c983-4199-b12e-c0073d645e3b/manager/0.log" Feb 23 09:25:59 crc kubenswrapper[4626]: I0223 09:25:59.231090 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-jlcjb_984c81c1-d4bf-417c-9937-d5de29d33a00/manager/0.log" Feb 23 09:25:59 crc kubenswrapper[4626]: I0223 09:25:59.543970 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-dzl5c_35a1fc39-9675-4591-95a7-aa1ff016b779/manager/0.log" Feb 23 09:25:59 crc kubenswrapper[4626]: I0223 09:25:59.824034 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:59 crc kubenswrapper[4626]: I0223 09:25:59.825477 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:25:59 crc kubenswrapper[4626]: I0223 09:25:59.864101 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:26:00 crc kubenswrapper[4626]: I0223 09:26:00.278119 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-qqpl6_a05a1f00-3183-484d-9c35-7db986a84e8a/manager/0.log" Feb 23 09:26:00 crc kubenswrapper[4626]: I0223 09:26:00.417660 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-6qn7p_47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5/manager/0.log" Feb 23 09:26:00 crc kubenswrapper[4626]: I0223 09:26:00.805480 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:26:00 crc kubenswrapper[4626]: I0223 09:26:00.873593 4626 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nhbm"] Feb 23 09:26:01 crc kubenswrapper[4626]: I0223 09:26:01.271560 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-zsrhb_682961f6-ea8c-4883-93e3-af65115c9507/manager/0.log" Feb 23 09:26:01 crc kubenswrapper[4626]: I0223 09:26:01.377357 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-g6gzq_8c4d609a-2d99-44b9-9c86-e20a3965381b/manager/0.log" Feb 23 09:26:01 crc kubenswrapper[4626]: I0223 09:26:01.621665 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-j524z_cfc5b69c-d190-4f73-a311-c9a371762530/manager/0.log" Feb 23 09:26:01 crc kubenswrapper[4626]: I0223 09:26:01.685897 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-r6gc2_4f909aac-ac8d-4ec5-8404-e9c1f77a144c/manager/0.log" Feb 23 09:26:01 crc kubenswrapper[4626]: I0223 09:26:01.866791 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-mnbrh_98f392cd-76ee-4062-8ad1-15608b3658dc/manager/0.log" Feb 23 09:26:02 crc kubenswrapper[4626]: I0223 09:26:02.047349 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-8nqw2_df31bf79-feee-4644-a1eb-bd6d5af05d7f/manager/0.log" Feb 23 09:26:02 crc kubenswrapper[4626]: I0223 09:26:02.621164 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75_d8a75041-f9ba-4691-9467-f20f9205daa6/manager/0.log" Feb 23 09:26:02 crc kubenswrapper[4626]: I0223 09:26:02.752621 4626 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-7nhbm" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="registry-server" containerID="cri-o://8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615" gracePeriod=2 Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.099584 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-vqprw_8e6ad56b-628d-40a0-b847-0e0b0040ad46/operator/0.log" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.609585 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.709363 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkm7r\" (UniqueName: \"kubernetes.io/projected/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-kube-api-access-fkm7r\") pod \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.709569 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-utilities\") pod \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.709675 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-catalog-content\") pod \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\" (UID: \"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971\") " Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.717719 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-utilities" (OuterVolumeSpecName: "utilities") 
pod "98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" (UID: "98bdd0d6-38b7-49ba-a639-d1c7fe4a8971"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.727492 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" (UID: "98bdd0d6-38b7-49ba-a639-d1c7fe4a8971"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.734340 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-kube-api-access-fkm7r" (OuterVolumeSpecName: "kube-api-access-fkm7r") pod "98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" (UID: "98bdd0d6-38b7-49ba-a639-d1c7fe4a8971"). InnerVolumeSpecName "kube-api-access-fkm7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.766795 4626 generic.go:334] "Generic (PLEG): container finished" podID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerID="8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615" exitCode=0 Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.766844 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nhbm" event={"ID":"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971","Type":"ContainerDied","Data":"8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615"} Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.766883 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7nhbm" event={"ID":"98bdd0d6-38b7-49ba-a639-d1c7fe4a8971","Type":"ContainerDied","Data":"1bbac1ec933dbdb2cf9ddcfed9eefc6cb8d2f618b0f54087421efb8bf809305f"} Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.766903 4626 scope.go:117] "RemoveContainer" containerID="8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.767088 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7nhbm" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.812679 4626 scope.go:117] "RemoveContainer" containerID="ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.816040 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.816273 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkm7r\" (UniqueName: \"kubernetes.io/projected/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-kube-api-access-fkm7r\") on node \"crc\" DevicePath \"\"" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.816533 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.821229 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dqtlp_0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7/registry-server/0.log" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.849903 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nhbm"] Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.849938 4626 scope.go:117] "RemoveContainer" containerID="839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.868411 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7nhbm"] Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.883452 4626 scope.go:117] "RemoveContainer" 
containerID="8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615" Feb 23 09:26:03 crc kubenswrapper[4626]: E0223 09:26:03.883930 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615\": container with ID starting with 8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615 not found: ID does not exist" containerID="8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.883976 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615"} err="failed to get container status \"8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615\": rpc error: code = NotFound desc = could not find container \"8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615\": container with ID starting with 8c1287eed7aabc889b3be35a1effbf8e935a1ef6358f9ced7188f29e31604615 not found: ID does not exist" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.884005 4626 scope.go:117] "RemoveContainer" containerID="ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544" Feb 23 09:26:03 crc kubenswrapper[4626]: E0223 09:26:03.885939 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544\": container with ID starting with ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544 not found: ID does not exist" containerID="ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.885972 4626 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544"} err="failed to get container status \"ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544\": rpc error: code = NotFound desc = could not find container \"ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544\": container with ID starting with ef50a3b2b8ca4aa06851531da3856051bd92c9e912a5eeb41e7266b3b9e82544 not found: ID does not exist" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.885992 4626 scope.go:117] "RemoveContainer" containerID="839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5" Feb 23 09:26:03 crc kubenswrapper[4626]: E0223 09:26:03.886272 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5\": container with ID starting with 839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5 not found: ID does not exist" containerID="839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.886315 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5"} err="failed to get container status \"839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5\": rpc error: code = NotFound desc = could not find container \"839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5\": container with ID starting with 839cfbf6c259214a0025105e677af607e19c1d02e142c4a19f8c7839521652e5 not found: ID does not exist" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 09:26:03.907334 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-vxbs9_8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a/manager/0.log" Feb 23 09:26:03 crc kubenswrapper[4626]: I0223 
09:26:03.992087 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" path="/var/lib/kubelet/pods/98bdd0d6-38b7-49ba-a639-d1c7fe4a8971/volumes" Feb 23 09:26:04 crc kubenswrapper[4626]: I0223 09:26:04.156276 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-db9df_1867cac1-c043-4677-9c64-786b1f261fd5/manager/0.log" Feb 23 09:26:04 crc kubenswrapper[4626]: I0223 09:26:04.427249 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vzwhb_02bcecce-5716-45c6-8d40-f1da91d26673/operator/0.log" Feb 23 09:26:04 crc kubenswrapper[4626]: I0223 09:26:04.792200 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-hv75x_7f639fb4-3160-4bd7-ab2a-86ea80cb51ed/manager/0.log" Feb 23 09:26:05 crc kubenswrapper[4626]: I0223 09:26:05.178190 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-vdx4s_6f3e393a-a659-44ae-bad8-6e4ff2d649ce/manager/0.log" Feb 23 09:26:05 crc kubenswrapper[4626]: I0223 09:26:05.497134 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-tzh2l_d9ef444e-448b-4576-960b-5861b7c19720/manager/0.log" Feb 23 09:26:05 crc kubenswrapper[4626]: I0223 09:26:05.672842 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-mf52c_fab598e9-890d-4f24-b26b-8f5b507a86c8/manager/0.log" Feb 23 09:26:05 crc kubenswrapper[4626]: I0223 09:26:05.840006 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-qfd84_cbbbb529-024e-4f9e-ad1c-063c63f39324/manager/0.log" Feb 23 09:26:06 crc 
kubenswrapper[4626]: I0223 09:26:06.615477 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-j9cqz_7e5b7475-eb3d-4c76-955d-0d9948cf2fe7/manager/0.log" Feb 23 09:26:13 crc kubenswrapper[4626]: I0223 09:26:13.563518 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-867lr_d20a75f5-68d0-4f32-bea9-62fdac3a3498/manager/0.log" Feb 23 09:26:25 crc kubenswrapper[4626]: I0223 09:26:25.686114 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:26:25 crc kubenswrapper[4626]: I0223 09:26:25.687677 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:26:29 crc kubenswrapper[4626]: I0223 09:26:29.114908 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-q9kk6_db82ca2b-5eac-4858-8808-7b6e22af0e26/control-plane-machine-set-operator/0.log" Feb 23 09:26:29 crc kubenswrapper[4626]: I0223 09:26:29.514419 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qnh2c_77d47a11-2a3a-4803-8f4b-3bfe07c27e00/kube-rbac-proxy/0.log" Feb 23 09:26:29 crc kubenswrapper[4626]: I0223 09:26:29.567977 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qnh2c_77d47a11-2a3a-4803-8f4b-3bfe07c27e00/machine-api-operator/0.log" Feb 23 09:26:44 crc kubenswrapper[4626]: I0223 09:26:44.308875 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-mlgd2_ab37c078-9524-4dd2-b8f3-450a17f5255d/cert-manager-controller/0.log" Feb 23 09:26:44 crc kubenswrapper[4626]: I0223 09:26:44.545339 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-r5bt4_c7d90dff-0264-49d6-9d9e-ed5063ee6976/cert-manager-cainjector/0.log" Feb 23 09:26:44 crc kubenswrapper[4626]: I0223 09:26:44.696525 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-66srh_4f172a38-4eaf-468a-99d8-99416128eef9/cert-manager-webhook/0.log" Feb 23 09:26:55 crc kubenswrapper[4626]: I0223 09:26:55.685868 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:26:55 crc kubenswrapper[4626]: I0223 09:26:55.686674 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:26:58 crc kubenswrapper[4626]: I0223 09:26:58.610467 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-s4n9r_ba7e56b0-e6c6-434c-9f46-c4526c1448f7/nmstate-console-plugin/0.log" Feb 23 09:26:58 crc kubenswrapper[4626]: I0223 09:26:58.797932 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-6fccn_0b5fd810-1004-455d-ac3c-b7d5fc387861/nmstate-handler/0.log" Feb 23 09:26:58 crc kubenswrapper[4626]: I0223 09:26:58.881310 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-5dgzp_89312997-cb41-4e39-9fb3-bb07a7b5d7b6/nmstate-metrics/0.log" Feb 23 09:26:58 crc kubenswrapper[4626]: I0223 09:26:58.928716 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-5dgzp_89312997-cb41-4e39-9fb3-bb07a7b5d7b6/kube-rbac-proxy/0.log" Feb 23 09:26:59 crc kubenswrapper[4626]: I0223 09:26:59.133937 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-7j8mq_33276922-ab9c-4bb0-ad0a-71ca54766841/nmstate-operator/0.log" Feb 23 09:26:59 crc kubenswrapper[4626]: I0223 09:26:59.176852 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-pvm5b_fbb43c1e-8e84-47ab-8bca-d2b1fc06efce/nmstate-webhook/0.log" Feb 23 09:27:25 crc kubenswrapper[4626]: I0223 09:27:25.685609 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:27:25 crc kubenswrapper[4626]: I0223 09:27:25.686367 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:27:25 crc kubenswrapper[4626]: I0223 09:27:25.687124 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 09:27:25 crc kubenswrapper[4626]: I0223 09:27:25.688969 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:27:25 crc kubenswrapper[4626]: I0223 09:27:25.689607 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" gracePeriod=600 Feb 23 09:27:25 crc kubenswrapper[4626]: E0223 09:27:25.820407 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:27:26 crc kubenswrapper[4626]: I0223 09:27:26.573398 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" exitCode=0 Feb 23 09:27:26 crc kubenswrapper[4626]: I0223 09:27:26.573839 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"} Feb 23 09:27:26 crc 
kubenswrapper[4626]: I0223 09:27:26.576088 4626 scope.go:117] "RemoveContainer" containerID="0e2319dd85ff58f417e9b19da900255d78470cf0b6837778a9207fb2e2656ffd" Feb 23 09:27:26 crc kubenswrapper[4626]: I0223 09:27:26.576233 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:27:26 crc kubenswrapper[4626]: E0223 09:27:26.576740 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:27:30 crc kubenswrapper[4626]: I0223 09:27:30.256736 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-94xt8_52957c0d-2568-459b-a83c-635bbd08c164/controller/0.log" Feb 23 09:27:30 crc kubenswrapper[4626]: I0223 09:27:30.282070 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-94xt8_52957c0d-2568-459b-a83c-635bbd08c164/kube-rbac-proxy/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.154393 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.352378 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.379514 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.399082 4626 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.448221 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.640749 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.662922 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.666351 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.680216 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.905386 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.906860 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:27:31 crc kubenswrapper[4626]: I0223 09:27:31.956762 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:27:32 crc kubenswrapper[4626]: I0223 09:27:32.019675 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/controller/0.log" Feb 23 09:27:32 crc kubenswrapper[4626]: I0223 09:27:32.214938 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/frr-metrics/0.log" Feb 23 09:27:32 crc kubenswrapper[4626]: I0223 09:27:32.287006 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/kube-rbac-proxy/0.log" Feb 23 09:27:32 crc kubenswrapper[4626]: I0223 09:27:32.301357 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/kube-rbac-proxy-frr/0.log" Feb 23 09:27:32 crc kubenswrapper[4626]: I0223 09:27:32.585715 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-tvj6h_06339724-62db-4cc8-930b-5ce2572b46da/frr-k8s-webhook-server/0.log" Feb 23 09:27:32 crc kubenswrapper[4626]: I0223 09:27:32.605614 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/reloader/0.log" Feb 23 09:27:32 crc kubenswrapper[4626]: I0223 09:27:32.876972 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7df8f6cc8-jqkkq_5605b9df-eb52-4cb0-8f48-a273404aaf5d/manager/0.log" Feb 23 09:27:33 crc kubenswrapper[4626]: I0223 09:27:33.178763 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b49bc4cb8-dxzbr_e6756a3b-e295-453e-a3a6-1bc81275c97b/webhook-server/0.log" Feb 23 09:27:33 crc kubenswrapper[4626]: I0223 09:27:33.260263 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cnjgx_a86df0ed-d52d-4095-aed6-b298542a1c2e/kube-rbac-proxy/0.log" Feb 23 09:27:34 crc kubenswrapper[4626]: I0223 09:27:34.177455 4626 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/frr/0.log" Feb 23 09:27:34 crc kubenswrapper[4626]: I0223 09:27:34.194623 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cnjgx_a86df0ed-d52d-4095-aed6-b298542a1c2e/speaker/0.log" Feb 23 09:27:39 crc kubenswrapper[4626]: I0223 09:27:39.982957 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:27:39 crc kubenswrapper[4626]: E0223 09:27:39.984087 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.024927 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/util/0.log" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.316198 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/pull/0.log" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.346696 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/util/0.log" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.362200 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/pull/0.log" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.588581 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/util/0.log" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.674069 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/pull/0.log" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.718465 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/extract/0.log" Feb 23 09:27:51 crc kubenswrapper[4626]: I0223 09:27:51.834163 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-utilities/0.log" Feb 23 09:27:52 crc kubenswrapper[4626]: I0223 09:27:52.100579 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-utilities/0.log" Feb 23 09:27:52 crc kubenswrapper[4626]: I0223 09:27:52.104385 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-content/0.log" Feb 23 09:27:52 crc kubenswrapper[4626]: I0223 09:27:52.185269 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-content/0.log" Feb 23 09:27:52 crc kubenswrapper[4626]: I0223 09:27:52.323035 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-utilities/0.log" Feb 23 09:27:52 crc kubenswrapper[4626]: I0223 09:27:52.358256 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-content/0.log" Feb 23 09:27:52 crc kubenswrapper[4626]: I0223 09:27:52.631720 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-utilities/0.log" Feb 23 09:27:52 crc kubenswrapper[4626]: I0223 09:27:52.867618 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-utilities/0.log" Feb 23 09:27:53 crc kubenswrapper[4626]: I0223 09:27:53.264943 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/registry-server/0.log" Feb 23 09:27:53 crc kubenswrapper[4626]: I0223 09:27:53.343877 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-content/0.log" Feb 23 09:27:53 crc kubenswrapper[4626]: I0223 09:27:53.369475 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-content/0.log" Feb 23 09:27:53 crc kubenswrapper[4626]: I0223 09:27:53.598719 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-utilities/0.log" Feb 23 09:27:53 crc kubenswrapper[4626]: I0223 09:27:53.622898 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-content/0.log" Feb 23 09:27:53 crc kubenswrapper[4626]: I0223 09:27:53.912708 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/util/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.283786 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/util/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.293855 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/pull/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.378473 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/pull/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.648624 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/extract/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.687547 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/util/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.709029 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/registry-server/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 
09:27:54.809251 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/pull/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.962103 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-222dt_34eadfde-147c-470f-bf62-db7e15fbf337/marketplace-operator/0.log" Feb 23 09:27:54 crc kubenswrapper[4626]: I0223 09:27:54.981907 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:27:54 crc kubenswrapper[4626]: E0223 09:27:54.984978 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.121291 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-utilities/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.349582 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-content/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.382875 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-utilities/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.409441 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-content/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.621690 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-content/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.636664 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-utilities/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.640085 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-utilities/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.965778 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-content/0.log" Feb 23 09:27:55 crc kubenswrapper[4626]: I0223 09:27:55.971959 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/registry-server/0.log" Feb 23 09:27:56 crc kubenswrapper[4626]: I0223 09:27:56.035114 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-content/0.log" Feb 23 09:27:56 crc kubenswrapper[4626]: I0223 09:27:56.072119 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-utilities/0.log" Feb 23 09:27:56 crc kubenswrapper[4626]: I0223 09:27:56.229960 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-utilities/0.log" 
Feb 23 09:27:56 crc kubenswrapper[4626]: I0223 09:27:56.246370 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-content/0.log" Feb 23 09:27:57 crc kubenswrapper[4626]: I0223 09:27:57.165833 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/registry-server/0.log" Feb 23 09:28:07 crc kubenswrapper[4626]: I0223 09:28:07.988641 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:28:07 crc kubenswrapper[4626]: E0223 09:28:07.989490 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:28:20 crc kubenswrapper[4626]: I0223 09:28:20.982267 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:28:20 crc kubenswrapper[4626]: E0223 09:28:20.983014 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:28:31 crc kubenswrapper[4626]: I0223 09:28:31.982702 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 
09:28:31 crc kubenswrapper[4626]: E0223 09:28:31.983994 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:28:44 crc kubenswrapper[4626]: I0223 09:28:44.982252 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:28:44 crc kubenswrapper[4626]: E0223 09:28:44.982959 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:28:57 crc kubenswrapper[4626]: I0223 09:28:57.990962 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:28:57 crc kubenswrapper[4626]: E0223 09:28:57.991942 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:29:08 crc kubenswrapper[4626]: I0223 09:29:08.981727 4626 scope.go:117] "RemoveContainer" 
containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:29:08 crc kubenswrapper[4626]: E0223 09:29:08.982599 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:29:23 crc kubenswrapper[4626]: I0223 09:29:23.983516 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:29:23 crc kubenswrapper[4626]: E0223 09:29:23.984706 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:29:25 crc kubenswrapper[4626]: I0223 09:29:25.479983 4626 scope.go:117] "RemoveContainer" containerID="f8f06f795715aa7419b6bdf49e31d621c609113784fcc8c0d9ff63e405387d3b" Feb 23 09:29:25 crc kubenswrapper[4626]: I0223 09:29:25.528021 4626 scope.go:117] "RemoveContainer" containerID="75e6d7ed485d60402b6a8164e1317c2bdc87024a000a0d8a793e7f7ad52f28c7" Feb 23 09:29:34 crc kubenswrapper[4626]: I0223 09:29:34.982843 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:29:34 crc kubenswrapper[4626]: E0223 09:29:34.983693 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:29:45 crc kubenswrapper[4626]: I0223 09:29:45.982358 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:29:45 crc kubenswrapper[4626]: E0223 09:29:45.983744 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.395209 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"] Feb 23 09:30:00 crc kubenswrapper[4626]: E0223 09:30:00.398519 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="registry-server" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.398618 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="registry-server" Feb 23 09:30:00 crc kubenswrapper[4626]: E0223 09:30:00.398640 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="extract-utilities" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.398648 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="extract-utilities" Feb 23 09:30:00 crc kubenswrapper[4626]: E0223 09:30:00.398681 
4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="extract-content" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.398688 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="extract-content" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.402065 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bdd0d6-38b7-49ba-a639-d1c7fe4a8971" containerName="registry-server" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.405762 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.416470 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.418679 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.515051 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"] Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.608187 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c573df0-1c57-4bba-a56a-1dc53492d95a-secret-volume\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.608404 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6c573df0-1c57-4bba-a56a-1dc53492d95a-config-volume\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.608443 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjk4j\" (UniqueName: \"kubernetes.io/projected/6c573df0-1c57-4bba-a56a-1dc53492d95a-kube-api-access-xjk4j\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.711540 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c573df0-1c57-4bba-a56a-1dc53492d95a-secret-volume\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.711976 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c573df0-1c57-4bba-a56a-1dc53492d95a-config-volume\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.712002 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjk4j\" (UniqueName: \"kubernetes.io/projected/6c573df0-1c57-4bba-a56a-1dc53492d95a-kube-api-access-xjk4j\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" Feb 23 09:30:00 crc 
kubenswrapper[4626]: I0223 09:30:00.714702 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c573df0-1c57-4bba-a56a-1dc53492d95a-config-volume\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"
Feb 23 09:30:00 crc kubenswrapper[4626]: I0223 09:30:00.982744 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:30:00 crc kubenswrapper[4626]: E0223 09:30:00.983167 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:30:01 crc kubenswrapper[4626]: I0223 09:30:01.308746 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjk4j\" (UniqueName: \"kubernetes.io/projected/6c573df0-1c57-4bba-a56a-1dc53492d95a-kube-api-access-xjk4j\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"
Feb 23 09:30:01 crc kubenswrapper[4626]: I0223 09:30:01.309734 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c573df0-1c57-4bba-a56a-1dc53492d95a-secret-volume\") pod \"collect-profiles-29530650-8wkmf\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"
Feb 23 09:30:01 crc kubenswrapper[4626]: I0223 09:30:01.334550 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"
Feb 23 09:30:02 crc kubenswrapper[4626]: I0223 09:30:02.122947 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"]
Feb 23 09:30:02 crc kubenswrapper[4626]: I0223 09:30:02.212850 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" event={"ID":"6c573df0-1c57-4bba-a56a-1dc53492d95a","Type":"ContainerStarted","Data":"c9d30b8c9eab4eb7bf3d0793e056f45c15cc6863bdf7190bd6cb7cd360e43625"}
Feb 23 09:30:03 crc kubenswrapper[4626]: I0223 09:30:03.228414 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" event={"ID":"6c573df0-1c57-4bba-a56a-1dc53492d95a","Type":"ContainerStarted","Data":"114858148e1031dfb50dca7d6fbabb53653603c04f4fb44904ee8e6a5cfa7e7b"}
Feb 23 09:30:03 crc kubenswrapper[4626]: I0223 09:30:03.253170 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" podStartSLOduration=3.252332432 podStartE2EDuration="3.252332432s" podCreationTimestamp="2026-02-23 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:30:03.246199138 +0000 UTC m=+10155.585528404" watchObservedRunningTime="2026-02-23 09:30:03.252332432 +0000 UTC m=+10155.591661698"
Feb 23 09:30:04 crc kubenswrapper[4626]: I0223 09:30:04.240858 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" event={"ID":"6c573df0-1c57-4bba-a56a-1dc53492d95a","Type":"ContainerDied","Data":"114858148e1031dfb50dca7d6fbabb53653603c04f4fb44904ee8e6a5cfa7e7b"}
Feb 23 09:30:04 crc kubenswrapper[4626]: I0223 09:30:04.241277 4626 generic.go:334] "Generic (PLEG): container finished" podID="6c573df0-1c57-4bba-a56a-1dc53492d95a" containerID="114858148e1031dfb50dca7d6fbabb53653603c04f4fb44904ee8e6a5cfa7e7b" exitCode=0
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.550431 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.649099 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c573df0-1c57-4bba-a56a-1dc53492d95a-secret-volume\") pod \"6c573df0-1c57-4bba-a56a-1dc53492d95a\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") "
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.649309 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjk4j\" (UniqueName: \"kubernetes.io/projected/6c573df0-1c57-4bba-a56a-1dc53492d95a-kube-api-access-xjk4j\") pod \"6c573df0-1c57-4bba-a56a-1dc53492d95a\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") "
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.649376 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c573df0-1c57-4bba-a56a-1dc53492d95a-config-volume\") pod \"6c573df0-1c57-4bba-a56a-1dc53492d95a\" (UID: \"6c573df0-1c57-4bba-a56a-1dc53492d95a\") "
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.651280 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c573df0-1c57-4bba-a56a-1dc53492d95a-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c573df0-1c57-4bba-a56a-1dc53492d95a" (UID: "6c573df0-1c57-4bba-a56a-1dc53492d95a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.657046 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c573df0-1c57-4bba-a56a-1dc53492d95a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c573df0-1c57-4bba-a56a-1dc53492d95a" (UID: "6c573df0-1c57-4bba-a56a-1dc53492d95a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.657950 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c573df0-1c57-4bba-a56a-1dc53492d95a-kube-api-access-xjk4j" (OuterVolumeSpecName: "kube-api-access-xjk4j") pod "6c573df0-1c57-4bba-a56a-1dc53492d95a" (UID: "6c573df0-1c57-4bba-a56a-1dc53492d95a"). InnerVolumeSpecName "kube-api-access-xjk4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.751220 4626 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c573df0-1c57-4bba-a56a-1dc53492d95a-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.751570 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjk4j\" (UniqueName: \"kubernetes.io/projected/6c573df0-1c57-4bba-a56a-1dc53492d95a-kube-api-access-xjk4j\") on node \"crc\" DevicePath \"\""
Feb 23 09:30:05 crc kubenswrapper[4626]: I0223 09:30:05.751584 4626 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c573df0-1c57-4bba-a56a-1dc53492d95a-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 09:30:06 crc kubenswrapper[4626]: I0223 09:30:06.266834 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf" event={"ID":"6c573df0-1c57-4bba-a56a-1dc53492d95a","Type":"ContainerDied","Data":"c9d30b8c9eab4eb7bf3d0793e056f45c15cc6863bdf7190bd6cb7cd360e43625"}
Feb 23 09:30:06 crc kubenswrapper[4626]: I0223 09:30:06.267277 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-8wkmf"
Feb 23 09:30:06 crc kubenswrapper[4626]: I0223 09:30:06.267423 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9d30b8c9eab4eb7bf3d0793e056f45c15cc6863bdf7190bd6cb7cd360e43625"
Feb 23 09:30:06 crc kubenswrapper[4626]: I0223 09:30:06.335807 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh"]
Feb 23 09:30:06 crc kubenswrapper[4626]: I0223 09:30:06.379484 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-hddgh"]
Feb 23 09:30:07 crc kubenswrapper[4626]: I0223 09:30:07.997319 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6897aa9-e088-451a-99bb-d45572959840" path="/var/lib/kubelet/pods/f6897aa9-e088-451a-99bb-d45572959840/volumes"
Feb 23 09:30:12 crc kubenswrapper[4626]: I0223 09:30:12.982981 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:30:12 crc kubenswrapper[4626]: E0223 09:30:12.984098 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:30:23 crc kubenswrapper[4626]: I0223 09:30:23.467930 4626 generic.go:334] "Generic (PLEG): container finished" podID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerID="226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698" exitCode=0
Feb 23 09:30:23 crc kubenswrapper[4626]: I0223 09:30:23.468017 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vhk9d/must-gather-5dwvn" event={"ID":"0529f017-1e22-4c12-a8c8-c89be8726d3e","Type":"ContainerDied","Data":"226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698"}
Feb 23 09:30:23 crc kubenswrapper[4626]: I0223 09:30:23.469792 4626 scope.go:117] "RemoveContainer" containerID="226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698"
Feb 23 09:30:23 crc kubenswrapper[4626]: I0223 09:30:23.856692 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vhk9d_must-gather-5dwvn_0529f017-1e22-4c12-a8c8-c89be8726d3e/gather/0.log"
Feb 23 09:30:25 crc kubenswrapper[4626]: I0223 09:30:25.565451 4626 scope.go:117] "RemoveContainer" containerID="3301dee6e19716bddef6caa193622a2ef2a02bf57cfce426ae949dc9b948fc08"
Feb 23 09:30:25 crc kubenswrapper[4626]: I0223 09:30:25.593262 4626 scope.go:117] "RemoveContainer" containerID="d4ddd2b4f049a13ba2d4b801edbf3a0f706cf5ba2678d8210047de1153a9b840"
Feb 23 09:30:25 crc kubenswrapper[4626]: I0223 09:30:25.620042 4626 scope.go:117] "RemoveContainer" containerID="24f55ce3e1acda781f196e9163e82f87d89bef681187e82302878fa911662276"
Feb 23 09:30:27 crc kubenswrapper[4626]: I0223 09:30:27.990889 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:30:27 crc kubenswrapper[4626]: E0223 09:30:27.992421 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:30:33 crc kubenswrapper[4626]: I0223 09:30:33.458966 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vhk9d/must-gather-5dwvn"]
Feb 23 09:30:33 crc kubenswrapper[4626]: I0223 09:30:33.461716 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vhk9d/must-gather-5dwvn" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerName="copy" containerID="cri-o://093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58" gracePeriod=2
Feb 23 09:30:33 crc kubenswrapper[4626]: I0223 09:30:33.469330 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vhk9d/must-gather-5dwvn"]
Feb 23 09:30:33 crc kubenswrapper[4626]: I0223 09:30:33.835367 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vhk9d_must-gather-5dwvn_0529f017-1e22-4c12-a8c8-c89be8726d3e/copy/0.log"
Feb 23 09:30:33 crc kubenswrapper[4626]: I0223 09:30:33.836301 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:30:33 crc kubenswrapper[4626]: I0223 09:30:33.996389 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0529f017-1e22-4c12-a8c8-c89be8726d3e-must-gather-output\") pod \"0529f017-1e22-4c12-a8c8-c89be8726d3e\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") "
Feb 23 09:30:33 crc kubenswrapper[4626]: I0223 09:30:33.996870 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrkdf\" (UniqueName: \"kubernetes.io/projected/0529f017-1e22-4c12-a8c8-c89be8726d3e-kube-api-access-qrkdf\") pod \"0529f017-1e22-4c12-a8c8-c89be8726d3e\" (UID: \"0529f017-1e22-4c12-a8c8-c89be8726d3e\") "
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.011085 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0529f017-1e22-4c12-a8c8-c89be8726d3e-kube-api-access-qrkdf" (OuterVolumeSpecName: "kube-api-access-qrkdf") pod "0529f017-1e22-4c12-a8c8-c89be8726d3e" (UID: "0529f017-1e22-4c12-a8c8-c89be8726d3e"). InnerVolumeSpecName "kube-api-access-qrkdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.102349 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrkdf\" (UniqueName: \"kubernetes.io/projected/0529f017-1e22-4c12-a8c8-c89be8726d3e-kube-api-access-qrkdf\") on node \"crc\" DevicePath \"\""
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.177661 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0529f017-1e22-4c12-a8c8-c89be8726d3e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0529f017-1e22-4c12-a8c8-c89be8726d3e" (UID: "0529f017-1e22-4c12-a8c8-c89be8726d3e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.205371 4626 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0529f017-1e22-4c12-a8c8-c89be8726d3e-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.582548 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vhk9d_must-gather-5dwvn_0529f017-1e22-4c12-a8c8-c89be8726d3e/copy/0.log"
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.583251 4626 generic.go:334] "Generic (PLEG): container finished" podID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerID="093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58" exitCode=143
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.583319 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vhk9d/must-gather-5dwvn"
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.583328 4626 scope.go:117] "RemoveContainer" containerID="093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58"
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.613714 4626 scope.go:117] "RemoveContainer" containerID="226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698"
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.675400 4626 scope.go:117] "RemoveContainer" containerID="093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58"
Feb 23 09:30:34 crc kubenswrapper[4626]: E0223 09:30:34.676608 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58\": container with ID starting with 093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58 not found: ID does not exist" containerID="093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58"
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.676664 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58"} err="failed to get container status \"093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58\": rpc error: code = NotFound desc = could not find container \"093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58\": container with ID starting with 093bd6180500baba06009cc268486ce6a23c8ffb816c09b852f39f1fe2463f58 not found: ID does not exist"
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.676694 4626 scope.go:117] "RemoveContainer" containerID="226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698"
Feb 23 09:30:34 crc kubenswrapper[4626]: E0223 09:30:34.677127 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698\": container with ID starting with 226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698 not found: ID does not exist" containerID="226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698"
Feb 23 09:30:34 crc kubenswrapper[4626]: I0223 09:30:34.677169 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698"} err="failed to get container status \"226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698\": rpc error: code = NotFound desc = could not find container \"226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698\": container with ID starting with 226b2f7d13d87a29bc908f8f5f624a9af976c1eb39cac2643e82109f4f16d698 not found: ID does not exist"
Feb 23 09:30:35 crc kubenswrapper[4626]: I0223 09:30:35.994074 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" path="/var/lib/kubelet/pods/0529f017-1e22-4c12-a8c8-c89be8726d3e/volumes"
Feb 23 09:30:41 crc kubenswrapper[4626]: I0223 09:30:41.982308 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:30:41 crc kubenswrapper[4626]: E0223 09:30:41.983385 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:30:52 crc kubenswrapper[4626]: I0223 09:30:52.982570 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:30:52 crc kubenswrapper[4626]: E0223 09:30:52.983475 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:31:03 crc kubenswrapper[4626]: I0223 09:31:03.983269 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:31:03 crc kubenswrapper[4626]: E0223 09:31:03.984547 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:31:18 crc kubenswrapper[4626]: I0223 09:31:18.982171 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:31:18 crc kubenswrapper[4626]: E0223 09:31:18.983283 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:31:25 crc kubenswrapper[4626]: I0223 09:31:25.750232 4626 scope.go:117] "RemoveContainer" containerID="ee57486ae36ab510e7e4f0930c1178347a7db1a0254fb52905789958f8fee157"
Feb 23 09:31:31 crc kubenswrapper[4626]: I0223 09:31:31.982035 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:31:31 crc kubenswrapper[4626]: E0223 09:31:31.982948 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:31:45 crc kubenswrapper[4626]: I0223 09:31:45.982481 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:31:45 crc kubenswrapper[4626]: E0223 09:31:45.983627 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:31:58 crc kubenswrapper[4626]: I0223 09:31:58.983400 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:31:58 crc kubenswrapper[4626]: E0223 09:31:58.985074 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:32:11 crc kubenswrapper[4626]: I0223 09:32:11.982279 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:32:11 crc kubenswrapper[4626]: E0223 09:32:11.984095 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.666902 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f4swc/must-gather-jj8rt"]
Feb 23 09:32:23 crc kubenswrapper[4626]: E0223 09:32:23.668037 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c573df0-1c57-4bba-a56a-1dc53492d95a" containerName="collect-profiles"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.668052 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c573df0-1c57-4bba-a56a-1dc53492d95a" containerName="collect-profiles"
Feb 23 09:32:23 crc kubenswrapper[4626]: E0223 09:32:23.668069 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerName="copy"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.668075 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerName="copy"
Feb 23 09:32:23 crc kubenswrapper[4626]: E0223 09:32:23.668085 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerName="gather"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.668091 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerName="gather"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.668328 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerName="copy"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.668339 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="0529f017-1e22-4c12-a8c8-c89be8726d3e" containerName="gather"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.668350 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c573df0-1c57-4bba-a56a-1dc53492d95a" containerName="collect-profiles"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.669382 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.679750 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f4swc/must-gather-jj8rt"]
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.680191 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f4swc"/"kube-root-ca.crt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.680190 4626 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f4swc"/"openshift-service-ca.crt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.848990 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mt77\" (UniqueName: \"kubernetes.io/projected/428281f6-5309-4718-aeec-82bdc3d2bf08-kube-api-access-4mt77\") pod \"must-gather-jj8rt\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") " pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.850283 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/428281f6-5309-4718-aeec-82bdc3d2bf08-must-gather-output\") pod \"must-gather-jj8rt\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") " pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.953521 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mt77\" (UniqueName: \"kubernetes.io/projected/428281f6-5309-4718-aeec-82bdc3d2bf08-kube-api-access-4mt77\") pod \"must-gather-jj8rt\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") " pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.953720 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/428281f6-5309-4718-aeec-82bdc3d2bf08-must-gather-output\") pod \"must-gather-jj8rt\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") " pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.954219 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/428281f6-5309-4718-aeec-82bdc3d2bf08-must-gather-output\") pod \"must-gather-jj8rt\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") " pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:23 crc kubenswrapper[4626]: I0223 09:32:23.987624 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mt77\" (UniqueName: \"kubernetes.io/projected/428281f6-5309-4718-aeec-82bdc3d2bf08-kube-api-access-4mt77\") pod \"must-gather-jj8rt\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") " pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:24 crc kubenswrapper[4626]: I0223 09:32:24.005759 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:32:24 crc kubenswrapper[4626]: I0223 09:32:24.264976 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f4swc/must-gather-jj8rt"]
Feb 23 09:32:24 crc kubenswrapper[4626]: I0223 09:32:24.560345 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/must-gather-jj8rt" event={"ID":"428281f6-5309-4718-aeec-82bdc3d2bf08","Type":"ContainerStarted","Data":"8b3ca2adab34830ca843769432badcf9faf2b8381b97a5254784f5af19a803d8"}
Feb 23 09:32:24 crc kubenswrapper[4626]: I0223 09:32:24.983785 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:32:24 crc kubenswrapper[4626]: E0223 09:32:24.990273 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:32:25 crc kubenswrapper[4626]: I0223 09:32:25.571275 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/must-gather-jj8rt" event={"ID":"428281f6-5309-4718-aeec-82bdc3d2bf08","Type":"ContainerStarted","Data":"c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645"}
Feb 23 09:32:25 crc kubenswrapper[4626]: I0223 09:32:25.571355 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/must-gather-jj8rt" event={"ID":"428281f6-5309-4718-aeec-82bdc3d2bf08","Type":"ContainerStarted","Data":"b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908"}
Feb 23 09:32:25 crc kubenswrapper[4626]: I0223 09:32:25.586922 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f4swc/must-gather-jj8rt" podStartSLOduration=2.586903626 podStartE2EDuration="2.586903626s" podCreationTimestamp="2026-02-23 09:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:32:25.58296464 +0000 UTC m=+10297.922293907" watchObservedRunningTime="2026-02-23 09:32:25.586903626 +0000 UTC m=+10297.926232892"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.533931 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f4swc/crc-debug-nt88w"]
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.535623 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.538898 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f4swc"/"default-dockercfg-k55j7"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.569377 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq987\" (UniqueName: \"kubernetes.io/projected/ff310b9d-d617-48d8-b8ba-1cdab35e4205-kube-api-access-hq987\") pod \"crc-debug-nt88w\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") " pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.569939 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff310b9d-d617-48d8-b8ba-1cdab35e4205-host\") pod \"crc-debug-nt88w\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") " pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.671399 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq987\" (UniqueName: \"kubernetes.io/projected/ff310b9d-d617-48d8-b8ba-1cdab35e4205-kube-api-access-hq987\") pod \"crc-debug-nt88w\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") " pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.671843 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff310b9d-d617-48d8-b8ba-1cdab35e4205-host\") pod \"crc-debug-nt88w\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") " pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.673022 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff310b9d-d617-48d8-b8ba-1cdab35e4205-host\") pod \"crc-debug-nt88w\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") " pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.690943 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq987\" (UniqueName: \"kubernetes.io/projected/ff310b9d-d617-48d8-b8ba-1cdab35e4205-kube-api-access-hq987\") pod \"crc-debug-nt88w\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") " pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:28 crc kubenswrapper[4626]: I0223 09:32:28.861696 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:32:29 crc kubenswrapper[4626]: I0223 09:32:29.613019 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-nt88w" event={"ID":"ff310b9d-d617-48d8-b8ba-1cdab35e4205","Type":"ContainerStarted","Data":"588f1b97cf298bba18e8b70a507ad6d372416aeac5f49dcf4e5e107b4c82e687"}
Feb 23 09:32:29 crc kubenswrapper[4626]: I0223 09:32:29.613624 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-nt88w" event={"ID":"ff310b9d-d617-48d8-b8ba-1cdab35e4205","Type":"ContainerStarted","Data":"ccc0a9365b1b867e9c8a6859b717333b0ed139ec3250ff3aed82c611b9cfb4ca"}
Feb 23 09:32:29 crc kubenswrapper[4626]: I0223 09:32:29.634025 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f4swc/crc-debug-nt88w" podStartSLOduration=1.63399532 podStartE2EDuration="1.63399532s" podCreationTimestamp="2026-02-23 09:32:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:32:29.628641918 +0000 UTC m=+10301.967971184" watchObservedRunningTime="2026-02-23 09:32:29.63399532 +0000 UTC m=+10301.973324586"
Feb 23 09:32:39 crc kubenswrapper[4626]: I0223 09:32:39.982286 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0"
Feb 23 09:32:40 crc kubenswrapper[4626]: I0223 09:32:40.714464 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"b9b2b1dc10e03fbf1190e5ec1f19a1cd0f71f9331cc439eaba9e224b7d34e112"}
Feb 23 09:33:07 crc kubenswrapper[4626]: I0223 09:33:07.962338 4626 generic.go:334] "Generic (PLEG): container finished" podID="ff310b9d-d617-48d8-b8ba-1cdab35e4205" containerID="588f1b97cf298bba18e8b70a507ad6d372416aeac5f49dcf4e5e107b4c82e687" exitCode=0
Feb 23 09:33:07 crc kubenswrapper[4626]: I0223 09:33:07.963715 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-nt88w" event={"ID":"ff310b9d-d617-48d8-b8ba-1cdab35e4205","Type":"ContainerDied","Data":"588f1b97cf298bba18e8b70a507ad6d372416aeac5f49dcf4e5e107b4c82e687"}
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.063161 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-nt88w"
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.111105 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f4swc/crc-debug-nt88w"]
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.122483 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f4swc/crc-debug-nt88w"]
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.146546 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff310b9d-d617-48d8-b8ba-1cdab35e4205-host\") pod \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") "
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.146679 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff310b9d-d617-48d8-b8ba-1cdab35e4205-host" (OuterVolumeSpecName: "host") pod "ff310b9d-d617-48d8-b8ba-1cdab35e4205" (UID: "ff310b9d-d617-48d8-b8ba-1cdab35e4205"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.146702 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq987\" (UniqueName: \"kubernetes.io/projected/ff310b9d-d617-48d8-b8ba-1cdab35e4205-kube-api-access-hq987\") pod \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\" (UID: \"ff310b9d-d617-48d8-b8ba-1cdab35e4205\") "
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.148360 4626 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff310b9d-d617-48d8-b8ba-1cdab35e4205-host\") on node \"crc\" DevicePath \"\""
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.155354 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff310b9d-d617-48d8-b8ba-1cdab35e4205-kube-api-access-hq987" (OuterVolumeSpecName: "kube-api-access-hq987") pod "ff310b9d-d617-48d8-b8ba-1cdab35e4205" (UID: "ff310b9d-d617-48d8-b8ba-1cdab35e4205"). InnerVolumeSpecName "kube-api-access-hq987". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.250643 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq987\" (UniqueName: \"kubernetes.io/projected/ff310b9d-d617-48d8-b8ba-1cdab35e4205-kube-api-access-hq987\") on node \"crc\" DevicePath \"\""
Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.986573 4626 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-nt88w" Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.993572 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff310b9d-d617-48d8-b8ba-1cdab35e4205" path="/var/lib/kubelet/pods/ff310b9d-d617-48d8-b8ba-1cdab35e4205/volumes" Feb 23 09:33:09 crc kubenswrapper[4626]: I0223 09:33:09.995995 4626 scope.go:117] "RemoveContainer" containerID="588f1b97cf298bba18e8b70a507ad6d372416aeac5f49dcf4e5e107b4c82e687" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.300813 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f4swc/crc-debug-v7j8h"] Feb 23 09:33:10 crc kubenswrapper[4626]: E0223 09:33:10.301379 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff310b9d-d617-48d8-b8ba-1cdab35e4205" containerName="container-00" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.301398 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff310b9d-d617-48d8-b8ba-1cdab35e4205" containerName="container-00" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.301680 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff310b9d-d617-48d8-b8ba-1cdab35e4205" containerName="container-00" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.302605 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.305694 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f4swc"/"default-dockercfg-k55j7" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.375211 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rm5x\" (UniqueName: \"kubernetes.io/projected/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-kube-api-access-9rm5x\") pod \"crc-debug-v7j8h\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.375903 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-host\") pod \"crc-debug-v7j8h\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.478279 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-host\") pod \"crc-debug-v7j8h\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.478397 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-host\") pod \"crc-debug-v7j8h\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.478655 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rm5x\" (UniqueName: 
\"kubernetes.io/projected/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-kube-api-access-9rm5x\") pod \"crc-debug-v7j8h\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.499023 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rm5x\" (UniqueName: \"kubernetes.io/projected/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-kube-api-access-9rm5x\") pod \"crc-debug-v7j8h\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.618673 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.995658 4626 generic.go:334] "Generic (PLEG): container finished" podID="9aa95451-a629-4ff1-8fe9-ae51dcd09d09" containerID="a41d3286d41d2b71fab167aa7b644e39c28af259e30b0c3db405ac0f40424f39" exitCode=0 Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.995980 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-v7j8h" event={"ID":"9aa95451-a629-4ff1-8fe9-ae51dcd09d09","Type":"ContainerDied","Data":"a41d3286d41d2b71fab167aa7b644e39c28af259e30b0c3db405ac0f40424f39"} Feb 23 09:33:10 crc kubenswrapper[4626]: I0223 09:33:10.996014 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-v7j8h" event={"ID":"9aa95451-a629-4ff1-8fe9-ae51dcd09d09","Type":"ContainerStarted","Data":"2b9e31be33b4511dff19a00a05dba89b7597b78d20c7398901fcd17a1b7554c2"} Feb 23 09:33:12 crc kubenswrapper[4626]: I0223 09:33:12.106319 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:12 crc kubenswrapper[4626]: I0223 09:33:12.109057 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-host\") pod \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " Feb 23 09:33:12 crc kubenswrapper[4626]: I0223 09:33:12.109251 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rm5x\" (UniqueName: \"kubernetes.io/projected/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-kube-api-access-9rm5x\") pod \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\" (UID: \"9aa95451-a629-4ff1-8fe9-ae51dcd09d09\") " Feb 23 09:33:12 crc kubenswrapper[4626]: I0223 09:33:12.112293 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-host" (OuterVolumeSpecName: "host") pod "9aa95451-a629-4ff1-8fe9-ae51dcd09d09" (UID: "9aa95451-a629-4ff1-8fe9-ae51dcd09d09"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 09:33:12 crc kubenswrapper[4626]: I0223 09:33:12.118553 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-kube-api-access-9rm5x" (OuterVolumeSpecName: "kube-api-access-9rm5x") pod "9aa95451-a629-4ff1-8fe9-ae51dcd09d09" (UID: "9aa95451-a629-4ff1-8fe9-ae51dcd09d09"). InnerVolumeSpecName "kube-api-access-9rm5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:33:12 crc kubenswrapper[4626]: I0223 09:33:12.211791 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rm5x\" (UniqueName: \"kubernetes.io/projected/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-kube-api-access-9rm5x\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:12 crc kubenswrapper[4626]: I0223 09:33:12.211828 4626 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9aa95451-a629-4ff1-8fe9-ae51dcd09d09-host\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:13 crc kubenswrapper[4626]: I0223 09:33:13.056764 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-v7j8h" event={"ID":"9aa95451-a629-4ff1-8fe9-ae51dcd09d09","Type":"ContainerDied","Data":"2b9e31be33b4511dff19a00a05dba89b7597b78d20c7398901fcd17a1b7554c2"} Feb 23 09:33:13 crc kubenswrapper[4626]: I0223 09:33:13.057213 4626 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b9e31be33b4511dff19a00a05dba89b7597b78d20c7398901fcd17a1b7554c2" Feb 23 09:33:13 crc kubenswrapper[4626]: I0223 09:33:13.057291 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-v7j8h" Feb 23 09:33:13 crc kubenswrapper[4626]: I0223 09:33:13.142804 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f4swc/crc-debug-v7j8h"] Feb 23 09:33:13 crc kubenswrapper[4626]: I0223 09:33:13.154896 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f4swc/crc-debug-v7j8h"] Feb 23 09:33:13 crc kubenswrapper[4626]: I0223 09:33:13.990324 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa95451-a629-4ff1-8fe9-ae51dcd09d09" path="/var/lib/kubelet/pods/9aa95451-a629-4ff1-8fe9-ae51dcd09d09/volumes" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.445810 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f4swc/crc-debug-57zw6"] Feb 23 09:33:14 crc kubenswrapper[4626]: E0223 09:33:14.446432 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa95451-a629-4ff1-8fe9-ae51dcd09d09" containerName="container-00" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.446447 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa95451-a629-4ff1-8fe9-ae51dcd09d09" containerName="container-00" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.446653 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa95451-a629-4ff1-8fe9-ae51dcd09d09" containerName="container-00" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.447287 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.449268 4626 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f4swc"/"default-dockercfg-k55j7" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.477383 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7a893f-a71a-4997-8e10-8ffbe098d024-host\") pod \"crc-debug-57zw6\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.477591 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkhpr\" (UniqueName: \"kubernetes.io/projected/3f7a893f-a71a-4997-8e10-8ffbe098d024-kube-api-access-pkhpr\") pod \"crc-debug-57zw6\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.581408 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7a893f-a71a-4997-8e10-8ffbe098d024-host\") pod \"crc-debug-57zw6\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.581617 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkhpr\" (UniqueName: \"kubernetes.io/projected/3f7a893f-a71a-4997-8e10-8ffbe098d024-kube-api-access-pkhpr\") pod \"crc-debug-57zw6\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.581864 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/3f7a893f-a71a-4997-8e10-8ffbe098d024-host\") pod \"crc-debug-57zw6\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.602120 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkhpr\" (UniqueName: \"kubernetes.io/projected/3f7a893f-a71a-4997-8e10-8ffbe098d024-kube-api-access-pkhpr\") pod \"crc-debug-57zw6\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:14 crc kubenswrapper[4626]: I0223 09:33:14.764353 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:15 crc kubenswrapper[4626]: I0223 09:33:15.079668 4626 generic.go:334] "Generic (PLEG): container finished" podID="3f7a893f-a71a-4997-8e10-8ffbe098d024" containerID="850018e84d472e0346c7c48cc607640d6f43d3eea0377a1169c8569cee124a45" exitCode=0 Feb 23 09:33:15 crc kubenswrapper[4626]: I0223 09:33:15.079713 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-57zw6" event={"ID":"3f7a893f-a71a-4997-8e10-8ffbe098d024","Type":"ContainerDied","Data":"850018e84d472e0346c7c48cc607640d6f43d3eea0377a1169c8569cee124a45"} Feb 23 09:33:15 crc kubenswrapper[4626]: I0223 09:33:15.080076 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/crc-debug-57zw6" event={"ID":"3f7a893f-a71a-4997-8e10-8ffbe098d024","Type":"ContainerStarted","Data":"0ea397ca2584ed9e6f202b798be0ab72c88da1280ecaefd239014ee6cd6803c0"} Feb 23 09:33:15 crc kubenswrapper[4626]: I0223 09:33:15.127702 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f4swc/crc-debug-57zw6"] Feb 23 09:33:15 crc kubenswrapper[4626]: I0223 09:33:15.135737 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-f4swc/crc-debug-57zw6"] Feb 23 09:33:16 crc kubenswrapper[4626]: I0223 09:33:16.176208 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:16 crc kubenswrapper[4626]: I0223 09:33:16.216400 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkhpr\" (UniqueName: \"kubernetes.io/projected/3f7a893f-a71a-4997-8e10-8ffbe098d024-kube-api-access-pkhpr\") pod \"3f7a893f-a71a-4997-8e10-8ffbe098d024\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " Feb 23 09:33:16 crc kubenswrapper[4626]: I0223 09:33:16.216477 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7a893f-a71a-4997-8e10-8ffbe098d024-host\") pod \"3f7a893f-a71a-4997-8e10-8ffbe098d024\" (UID: \"3f7a893f-a71a-4997-8e10-8ffbe098d024\") " Feb 23 09:33:16 crc kubenswrapper[4626]: I0223 09:33:16.216612 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f7a893f-a71a-4997-8e10-8ffbe098d024-host" (OuterVolumeSpecName: "host") pod "3f7a893f-a71a-4997-8e10-8ffbe098d024" (UID: "3f7a893f-a71a-4997-8e10-8ffbe098d024"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 09:33:16 crc kubenswrapper[4626]: I0223 09:33:16.217905 4626 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f7a893f-a71a-4997-8e10-8ffbe098d024-host\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:16 crc kubenswrapper[4626]: I0223 09:33:16.222871 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7a893f-a71a-4997-8e10-8ffbe098d024-kube-api-access-pkhpr" (OuterVolumeSpecName: "kube-api-access-pkhpr") pod "3f7a893f-a71a-4997-8e10-8ffbe098d024" (UID: "3f7a893f-a71a-4997-8e10-8ffbe098d024"). 
InnerVolumeSpecName "kube-api-access-pkhpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:33:16 crc kubenswrapper[4626]: I0223 09:33:16.319409 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkhpr\" (UniqueName: \"kubernetes.io/projected/3f7a893f-a71a-4997-8e10-8ffbe098d024-kube-api-access-pkhpr\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:17 crc kubenswrapper[4626]: I0223 09:33:17.096681 4626 scope.go:117] "RemoveContainer" containerID="850018e84d472e0346c7c48cc607640d6f43d3eea0377a1169c8569cee124a45" Feb 23 09:33:17 crc kubenswrapper[4626]: I0223 09:33:17.096815 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/crc-debug-57zw6" Feb 23 09:33:17 crc kubenswrapper[4626]: I0223 09:33:17.994161 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7a893f-a71a-4997-8e10-8ffbe098d024" path="/var/lib/kubelet/pods/3f7a893f-a71a-4997-8e10-8ffbe098d024/volumes" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.193162 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k2g6m"] Feb 23 09:33:47 crc kubenswrapper[4626]: E0223 09:33:47.195673 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7a893f-a71a-4997-8e10-8ffbe098d024" containerName="container-00" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.195783 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7a893f-a71a-4997-8e10-8ffbe098d024" containerName="container-00" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.196147 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7a893f-a71a-4997-8e10-8ffbe098d024" containerName="container-00" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.198143 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.206168 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2g6m"] Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.335532 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljlx2\" (UniqueName: \"kubernetes.io/projected/c2342849-5802-4577-b127-727ec70c3362-kube-api-access-ljlx2\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.335686 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-catalog-content\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.335914 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-utilities\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.438792 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-utilities\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.439031 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ljlx2\" (UniqueName: \"kubernetes.io/projected/c2342849-5802-4577-b127-727ec70c3362-kube-api-access-ljlx2\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.439267 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-catalog-content\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.439399 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-utilities\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.439786 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-catalog-content\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.457819 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljlx2\" (UniqueName: \"kubernetes.io/projected/c2342849-5802-4577-b127-727ec70c3362-kube-api-access-ljlx2\") pod \"redhat-operators-k2g6m\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") " pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:47 crc kubenswrapper[4626]: I0223 09:33:47.520032 4626 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k2g6m" Feb 23 09:33:48 crc kubenswrapper[4626]: I0223 09:33:48.016811 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2g6m"] Feb 23 09:33:48 crc kubenswrapper[4626]: I0223 09:33:48.431611 4626 generic.go:334] "Generic (PLEG): container finished" podID="c2342849-5802-4577-b127-727ec70c3362" containerID="0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720" exitCode=0 Feb 23 09:33:48 crc kubenswrapper[4626]: I0223 09:33:48.431714 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2g6m" event={"ID":"c2342849-5802-4577-b127-727ec70c3362","Type":"ContainerDied","Data":"0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720"} Feb 23 09:33:48 crc kubenswrapper[4626]: I0223 09:33:48.432367 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2g6m" event={"ID":"c2342849-5802-4577-b127-727ec70c3362","Type":"ContainerStarted","Data":"acf37742e8e5bf873d255351e754d3e4950a229974661d714adc8ea2fbf0f602"} Feb 23 09:33:48 crc kubenswrapper[4626]: I0223 09:33:48.438200 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:33:49 crc kubenswrapper[4626]: I0223 09:33:49.443636 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2g6m" event={"ID":"c2342849-5802-4577-b127-727ec70c3362","Type":"ContainerStarted","Data":"bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9"} Feb 23 09:33:49 crc kubenswrapper[4626]: I0223 09:33:49.586954 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6958fdb966-vkk9n_e591795d-67ce-48d5-a54e-2f989878eca9/barbican-api/0.log" Feb 23 09:33:49 crc kubenswrapper[4626]: I0223 09:33:49.745069 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-6958fdb966-vkk9n_e591795d-67ce-48d5-a54e-2f989878eca9/barbican-api-log/0.log" Feb 23 09:33:49 crc kubenswrapper[4626]: I0223 09:33:49.886792 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6495996568-xfqgf_3bd22d53-a38c-4579-b6fd-e7934e32ca47/barbican-keystone-listener/0.log" Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.081941 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6495996568-xfqgf_3bd22d53-a38c-4579-b6fd-e7934e32ca47/barbican-keystone-listener-log/0.log" Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.132081 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7ccd97cd69-bpkw8_6ec78427-155b-4ed6-8d16-e56f099473c1/barbican-worker/0.log" Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.177153 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7ccd97cd69-bpkw8_6ec78427-155b-4ed6-8d16-e56f099473c1/barbican-worker-log/0.log" Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.411360 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-llhc6_5d5da7da-f1d4-4a24-9a9b-e22d85625761/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.531174 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/ceilometer-central-agent/0.log" Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.700590 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/proxy-httpd/0.log" Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.721377 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/sg-core/0.log" 
Feb 23 09:33:50 crc kubenswrapper[4626]: I0223 09:33:50.766555 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d9b9cc07-9e39-487b-85af-eaeaae575087/ceilometer-notification-agent/0.log"
Feb 23 09:33:51 crc kubenswrapper[4626]: I0223 09:33:51.028435 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5f9731f-2161-4757-97a7-e542f744362c/cinder-api-log/0.log"
Feb 23 09:33:51 crc kubenswrapper[4626]: I0223 09:33:51.110339 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5f9731f-2161-4757-97a7-e542f744362c/cinder-api/0.log"
Feb 23 09:33:51 crc kubenswrapper[4626]: I0223 09:33:51.275680 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8/cinder-scheduler/0.log"
Feb 23 09:33:51 crc kubenswrapper[4626]: I0223 09:33:51.322870 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7ee4b2a3-9db7-4cbf-9e5f-bfb347e789c8/probe/0.log"
Feb 23 09:33:51 crc kubenswrapper[4626]: I0223 09:33:51.431083 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lk9vn_17b6f47d-57a1-46e9-be66-1f93b98664c3/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:33:51 crc kubenswrapper[4626]: I0223 09:33:51.635832 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zckcd_1ea04624-3b44-4b2b-b89d-7799440e264f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:33:51 crc kubenswrapper[4626]: I0223 09:33:51.859759 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7cdb55cb5c-2pfm4_5f9703d1-1761-47e8-8524-c52def1bcac3/init/0.log"
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.271301 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7cdb55cb5c-2pfm4_5f9703d1-1761-47e8-8524-c52def1bcac3/init/0.log"
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.282717 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-jgf7k_b968ab81-8b5f-49c7-830b-220b90d6b1f1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.495868 4626 generic.go:334] "Generic (PLEG): container finished" podID="c2342849-5802-4577-b127-727ec70c3362" containerID="bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9" exitCode=0
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.495918 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2g6m" event={"ID":"c2342849-5802-4577-b127-727ec70c3362","Type":"ContainerDied","Data":"bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9"}
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.562468 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7cdb55cb5c-2pfm4_5f9703d1-1761-47e8-8524-c52def1bcac3/dnsmasq-dns/0.log"
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.619650 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2fc61e5-419c-4dab-9ddb-52bb9de855d5/glance-httpd/0.log"
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.629017 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e2fc61e5-419c-4dab-9ddb-52bb9de855d5/glance-log/0.log"
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.822828 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7afc12b1-f684-47fc-bb2f-201f09707ad6/glance-log/0.log"
Feb 23 09:33:52 crc kubenswrapper[4626]: I0223 09:33:52.934541 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7afc12b1-f684-47fc-bb2f-201f09707ad6/glance-httpd/0.log"
Feb 23 09:33:53 crc kubenswrapper[4626]: I0223 09:33:53.509471 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2g6m" event={"ID":"c2342849-5802-4577-b127-727ec70c3362","Type":"ContainerStarted","Data":"d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559"}
Feb 23 09:33:53 crc kubenswrapper[4626]: I0223 09:33:53.636929 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5d7b45f997-g8dhd_e53266a3-8d3d-44af-b0f7-c48a7170ceac/heat-engine/0.log"
Feb 23 09:33:53 crc kubenswrapper[4626]: I0223 09:33:53.903194 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-688bccf86-4crkw_d3e1e535-58de-4987-9d93-65fb6d4c9409/horizon/0.log"
Feb 23 09:33:54 crc kubenswrapper[4626]: I0223 09:33:54.312474 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-hxsq8_3d1fa218-95df-487b-b4d0-be0da8e72c58/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:33:54 crc kubenswrapper[4626]: I0223 09:33:54.720841 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ch9l8_f5ac8c56-2109-41c7-8129-5561016dbaef/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:33:55 crc kubenswrapper[4626]: I0223 09:33:55.030014 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-54768cd758-d6bbc_bdb47037-af0c-4d21-9e61-53b65fb113d1/heat-api/0.log"
Feb 23 09:33:55 crc kubenswrapper[4626]: I0223 09:33:55.103063 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7f6ddcf74f-chh24_05f9d0eb-fafc-496f-9fe2-8923f9d8db61/heat-cfnapi/0.log"
Feb 23 09:33:55 crc kubenswrapper[4626]: I0223 09:33:55.387378 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530501-tbhcl_4b61fdde-2209-4bd7-b3c1-a8f4123825a1/keystone-cron/0.log"
Feb 23 09:33:55 crc kubenswrapper[4626]: I0223 09:33:55.429632 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530561-fjctr_82a9f0b7-fae7-4dbf-8db4-b0f0695a0e7b/keystone-cron/0.log"
Feb 23 09:33:55 crc kubenswrapper[4626]: I0223 09:33:55.775390 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530621-vb4gm_c637caea-745d-4de8-9111-addf155f30c3/keystone-cron/0.log"
Feb 23 09:33:56 crc kubenswrapper[4626]: I0223 09:33:56.058448 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-688bccf86-4crkw_d3e1e535-58de-4987-9d93-65fb6d4c9409/horizon-log/0.log"
Feb 23 09:33:56 crc kubenswrapper[4626]: I0223 09:33:56.180574 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6b6610d6-48cf-4f86-ac4d-603b4bb60f04/kube-state-metrics/0.log"
Feb 23 09:33:56 crc kubenswrapper[4626]: I0223 09:33:56.437116 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-lc6cs_a9f59db3-8e35-432d-9dc1-bf70b5de9990/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:33:56 crc kubenswrapper[4626]: I0223 09:33:56.453785 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6ffd4ff45f-xttfr_9f0e3a3c-7106-4f7e-af92-a329a82fc625/keystone-api/0.log"
Feb 23 09:33:57 crc kubenswrapper[4626]: I0223 09:33:57.039984 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5zszb_671613e8-e8c1-40e7-86bf-026acd3864fe/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:33:57 crc kubenswrapper[4626]: I0223 09:33:57.069181 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-647d8d576f-jnrsm_f3b2c5c3-7e78-46b9-8365-396752a27b88/neutron-httpd/0.log"
Feb 23 09:33:57 crc kubenswrapper[4626]: I0223 09:33:57.520415 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k2g6m"
Feb 23 09:33:57 crc kubenswrapper[4626]: I0223 09:33:57.520463 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k2g6m"
Feb 23 09:33:57 crc kubenswrapper[4626]: I0223 09:33:57.954513 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-647d8d576f-jnrsm_f3b2c5c3-7e78-46b9-8365-396752a27b88/neutron-api/0.log"
Feb 23 09:33:58 crc kubenswrapper[4626]: I0223 09:33:58.559973 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2g6m" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="registry-server" probeResult="failure" output=<
Feb 23 09:33:58 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 09:33:58 crc kubenswrapper[4626]: >
Feb 23 09:33:59 crc kubenswrapper[4626]: I0223 09:33:59.092884 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e6972135-5165-4e01-9a21-591d7c07c533/nova-cell0-conductor-conductor/0.log"
Feb 23 09:33:59 crc kubenswrapper[4626]: I0223 09:33:59.121303 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a8470cfd-1a2b-4e2b-b59a-10f5de602156/nova-cell1-conductor-conductor/0.log"
Feb 23 09:33:59 crc kubenswrapper[4626]: I0223 09:33:59.857104 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bbcfc4cd-8abe-43d1-88c9-7ebb07787fb9/nova-cell1-novncproxy-novncproxy/0.log"
Feb 23 09:33:59 crc kubenswrapper[4626]: I0223 09:33:59.993296 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-7wfjm_ba805f29-0d45-499e-bc08-00188c51379f/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:00 crc kubenswrapper[4626]: I0223 09:34:00.478264 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f00ecb25-d721-43a6-810e-976c03a0572d/nova-metadata-log/0.log"
Feb 23 09:34:00 crc kubenswrapper[4626]: I0223 09:34:00.729698 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e3af6bff-9179-4a10-aa63-4227c8933818/nova-api-log/0.log"
Feb 23 09:34:01 crc kubenswrapper[4626]: I0223 09:34:01.463513 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03a2b658-d642-449c-be58-94b80484618e/mysql-bootstrap/0.log"
Feb 23 09:34:01 crc kubenswrapper[4626]: I0223 09:34:01.698082 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03a2b658-d642-449c-be58-94b80484618e/mysql-bootstrap/0.log"
Feb 23 09:34:02 crc kubenswrapper[4626]: I0223 09:34:02.008436 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_03a2b658-d642-449c-be58-94b80484618e/galera/0.log"
Feb 23 09:34:02 crc kubenswrapper[4626]: I0223 09:34:02.118929 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_8dcf4a83-5fb6-41df-bf75-af6298f822e1/nova-scheduler-scheduler/0.log"
Feb 23 09:34:02 crc kubenswrapper[4626]: I0223 09:34:02.475227 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3b2ea075-8063-4feb-8e91-3160073129ff/mysql-bootstrap/0.log"
Feb 23 09:34:02 crc kubenswrapper[4626]: I0223 09:34:02.674685 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3b2ea075-8063-4feb-8e91-3160073129ff/mysql-bootstrap/0.log"
Feb 23 09:34:02 crc kubenswrapper[4626]: I0223 09:34:02.796418 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3b2ea075-8063-4feb-8e91-3160073129ff/galera/0.log"
Feb 23 09:34:02 crc kubenswrapper[4626]: I0223 09:34:02.989806 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_e3af6bff-9179-4a10-aa63-4227c8933818/nova-api-api/0.log"
Feb 23 09:34:03 crc kubenswrapper[4626]: I0223 09:34:03.087727 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_36431530-ec45-4670-bc2c-ababbf867d6f/openstackclient/0.log"
Feb 23 09:34:03 crc kubenswrapper[4626]: I0223 09:34:03.264096 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9j9vm_61db3f96-4a68-44bd-82ff-076ba32d9066/ovn-controller/0.log"
Feb 23 09:34:03 crc kubenswrapper[4626]: I0223 09:34:03.530450 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-k59cf_707342b7-0d1b-431f-98bb-99af693f57b2/openstack-network-exporter/0.log"
Feb 23 09:34:03 crc kubenswrapper[4626]: I0223 09:34:03.723840 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovsdb-server-init/0.log"
Feb 23 09:34:04 crc kubenswrapper[4626]: I0223 09:34:04.032791 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovs-vswitchd/0.log"
Feb 23 09:34:04 crc kubenswrapper[4626]: I0223 09:34:04.044013 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovsdb-server-init/0.log"
Feb 23 09:34:04 crc kubenswrapper[4626]: I0223 09:34:04.114087 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-574ch_afd241f0-61b5-4185-a928-41cf22745048/ovsdb-server/0.log"
Feb 23 09:34:04 crc kubenswrapper[4626]: I0223 09:34:04.466592 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-szt5h_97c11447-7070-4233-aaf2-7661d687049d/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:04 crc kubenswrapper[4626]: I0223 09:34:04.693827 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9668927a-b529-4f44-a093-41260f069e34/openstack-network-exporter/0.log"
Feb 23 09:34:04 crc kubenswrapper[4626]: I0223 09:34:04.771828 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9668927a-b529-4f44-a093-41260f069e34/ovn-northd/0.log"
Feb 23 09:34:05 crc kubenswrapper[4626]: I0223 09:34:05.188171 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6b918ead-8c70-463f-b938-948436aa4278/openstack-network-exporter/0.log"
Feb 23 09:34:05 crc kubenswrapper[4626]: I0223 09:34:05.339798 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6b918ead-8c70-463f-b938-948436aa4278/ovsdbserver-nb/0.log"
Feb 23 09:34:05 crc kubenswrapper[4626]: I0223 09:34:05.531517 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f76a96be-520f-46e8-9e47-4a4d3237359e/openstack-network-exporter/0.log"
Feb 23 09:34:05 crc kubenswrapper[4626]: I0223 09:34:05.642282 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f76a96be-520f-46e8-9e47-4a4d3237359e/ovsdbserver-sb/0.log"
Feb 23 09:34:06 crc kubenswrapper[4626]: I0223 09:34:06.425659 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3d9572dd-1a1c-4261-a7ba-7538d24d769a/setup-container/0.log"
Feb 23 09:34:06 crc kubenswrapper[4626]: I0223 09:34:06.591574 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6cb97bcbf6-sl6hx_296f373f-42ac-474f-bc36-eab630843ed1/placement-api/0.log"
Feb 23 09:34:06 crc kubenswrapper[4626]: I0223 09:34:06.686893 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f00ecb25-d721-43a6-810e-976c03a0572d/nova-metadata-metadata/0.log"
Feb 23 09:34:06 crc kubenswrapper[4626]: I0223 09:34:06.751294 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3d9572dd-1a1c-4261-a7ba-7538d24d769a/setup-container/0.log"
Feb 23 09:34:06 crc kubenswrapper[4626]: I0223 09:34:06.853178 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6cb97bcbf6-sl6hx_296f373f-42ac-474f-bc36-eab630843ed1/placement-log/0.log"
Feb 23 09:34:06 crc kubenswrapper[4626]: I0223 09:34:06.918970 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3d9572dd-1a1c-4261-a7ba-7538d24d769a/rabbitmq/0.log"
Feb 23 09:34:07 crc kubenswrapper[4626]: I0223 09:34:07.117650 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_969fc15c-8289-4aa6-b590-9fa59f05783b/setup-container/0.log"
Feb 23 09:34:07 crc kubenswrapper[4626]: I0223 09:34:07.303337 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_969fc15c-8289-4aa6-b590-9fa59f05783b/setup-container/0.log"
Feb 23 09:34:07 crc kubenswrapper[4626]: I0223 09:34:07.544576 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_969fc15c-8289-4aa6-b590-9fa59f05783b/rabbitmq/0.log"
Feb 23 09:34:07 crc kubenswrapper[4626]: I0223 09:34:07.566850 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-z2rjk_80dffd5f-db5e-4946-9efe-8137bf36671f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:07 crc kubenswrapper[4626]: I0223 09:34:07.900252 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-cb2zm_1d465e63-5644-4732-a661-1134ffa03a78/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:07 crc kubenswrapper[4626]: I0223 09:34:07.958957 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-c782w_5bd897ec-9ff1-4dc4-87c5-db910e8593e4/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:08 crc kubenswrapper[4626]: I0223 09:34:08.213267 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pxhls_8e11e133-311c-4bd4-9989-a0e05f665f6a/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:08 crc kubenswrapper[4626]: I0223 09:34:08.327048 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-swrn8_28fc407d-b22d-432c-b5c1-7fbb18142e65/ssh-known-hosts-edpm-deployment/0.log"
Feb 23 09:34:08 crc kubenswrapper[4626]: I0223 09:34:08.427697 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_38df29c3-b467-498f-9ec1-a83cc91c27ca/memcached/0.log"
Feb 23 09:34:08 crc kubenswrapper[4626]: I0223 09:34:08.575910 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2g6m" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="registry-server" probeResult="failure" output=<
Feb 23 09:34:08 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s
Feb 23 09:34:08 crc kubenswrapper[4626]: >
Feb 23 09:34:08 crc kubenswrapper[4626]: I0223 09:34:08.746953 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6465458495-hgsdz_434e199c-4e18-4274-bbaa-f81f2e2a697b/proxy-server/0.log"
Feb 23 09:34:08 crc kubenswrapper[4626]: I0223 09:34:08.803203 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dkrrn_0f4f8444-d09b-4213-be5c-585c699d29ae/swift-ring-rebalance/0.log"
Feb 23 09:34:08 crc kubenswrapper[4626]: I0223 09:34:08.913003 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6465458495-hgsdz_434e199c-4e18-4274-bbaa-f81f2e2a697b/proxy-httpd/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.009170 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-auditor/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.085907 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-reaper/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.158097 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-replicator/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.173686 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-auditor/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.206362 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/account-server/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.341286 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-replicator/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.363712 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-server/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.407815 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/container-updater/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.450311 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-auditor/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.519786 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-expirer/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.676966 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-server/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.695552 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-updater/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.725038 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/rsync/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.740742 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/object-replicator/0.log"
Feb 23 09:34:09 crc kubenswrapper[4626]: I0223 09:34:09.829201 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5d736eeb-711a-4553-96ff-2b0d9741ac28/swift-recon-cron/0.log"
Feb 23 09:34:10 crc kubenswrapper[4626]: I0223 09:34:10.023910 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-multi-thread-testing_71976b26-a20d-4173-98a9-e4d5b553fb8b/tempest-tests-tempest-tests-runner/0.log"
Feb 23 09:34:10 crc kubenswrapper[4626]: I0223 09:34:10.068388 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-gkjqr_a8761a5a-7aba-46e2-9070-49cc7e866c7b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:10 crc kubenswrapper[4626]: I0223 09:34:10.442369 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-thread-testing_4774892a-4776-4db1-b74e-d78df11aa97e/tempest-tests-tempest-tests-runner/0.log"
Feb 23 09:34:10 crc kubenswrapper[4626]: I0223 09:34:10.527521 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d0209d94-e31f-4614-aa4d-e48ed971fcff/test-operator-logs-container/0.log"
Feb 23 09:34:10 crc kubenswrapper[4626]: I0223 09:34:10.666749 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z2ls4_8d4a8ee4-c271-4fe2-b9ea-85a1a176a000/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 23 09:34:17 crc kubenswrapper[4626]: I0223 09:34:17.570361 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k2g6m"
Feb 23 09:34:17 crc kubenswrapper[4626]: I0223 09:34:17.601193 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k2g6m" podStartSLOduration=26.040955966 podStartE2EDuration="30.599869961s" podCreationTimestamp="2026-02-23 09:33:47 +0000 UTC" firstStartedPulling="2026-02-23 09:33:48.434568032 +0000 UTC m=+10380.773897298" lastFinishedPulling="2026-02-23 09:33:52.993482028 +0000 UTC m=+10385.332811293" observedRunningTime="2026-02-23 09:33:53.529643254 +0000 UTC m=+10385.868972521" watchObservedRunningTime="2026-02-23 09:34:17.599869961 +0000 UTC m=+10409.939199227"
Feb 23 09:34:17 crc kubenswrapper[4626]: I0223 09:34:17.625551 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k2g6m"
Feb 23 09:34:18 crc kubenswrapper[4626]: I0223 09:34:18.405034 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2g6m"]
Feb 23 09:34:18 crc kubenswrapper[4626]: I0223 09:34:18.728670 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k2g6m" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="registry-server" containerID="cri-o://d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559" gracePeriod=2
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.513599 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2g6m"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.598390 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-utilities\") pod \"c2342849-5802-4577-b127-727ec70c3362\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") "
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.598517 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-catalog-content\") pod \"c2342849-5802-4577-b127-727ec70c3362\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") "
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.598803 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljlx2\" (UniqueName: \"kubernetes.io/projected/c2342849-5802-4577-b127-727ec70c3362-kube-api-access-ljlx2\") pod \"c2342849-5802-4577-b127-727ec70c3362\" (UID: \"c2342849-5802-4577-b127-727ec70c3362\") "
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.599597 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-utilities" (OuterVolumeSpecName: "utilities") pod "c2342849-5802-4577-b127-727ec70c3362" (UID: "c2342849-5802-4577-b127-727ec70c3362"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.617765 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2342849-5802-4577-b127-727ec70c3362-kube-api-access-ljlx2" (OuterVolumeSpecName: "kube-api-access-ljlx2") pod "c2342849-5802-4577-b127-727ec70c3362" (UID: "c2342849-5802-4577-b127-727ec70c3362"). InnerVolumeSpecName "kube-api-access-ljlx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.702442 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljlx2\" (UniqueName: \"kubernetes.io/projected/c2342849-5802-4577-b127-727ec70c3362-kube-api-access-ljlx2\") on node \"crc\" DevicePath \"\""
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.702484 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.707706 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2342849-5802-4577-b127-727ec70c3362" (UID: "c2342849-5802-4577-b127-727ec70c3362"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.739450 4626 generic.go:334] "Generic (PLEG): container finished" podID="c2342849-5802-4577-b127-727ec70c3362" containerID="d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559" exitCode=0
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.739531 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2g6m" event={"ID":"c2342849-5802-4577-b127-727ec70c3362","Type":"ContainerDied","Data":"d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559"}
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.740697 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2g6m" event={"ID":"c2342849-5802-4577-b127-727ec70c3362","Type":"ContainerDied","Data":"acf37742e8e5bf873d255351e754d3e4950a229974661d714adc8ea2fbf0f602"}
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.739572 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2g6m"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.740735 4626 scope.go:117] "RemoveContainer" containerID="d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.781096 4626 scope.go:117] "RemoveContainer" containerID="bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.794136 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2g6m"]
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.806305 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k2g6m"]
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.806645 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2342849-5802-4577-b127-727ec70c3362-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.822634 4626 scope.go:117] "RemoveContainer" containerID="0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.873344 4626 scope.go:117] "RemoveContainer" containerID="d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559"
Feb 23 09:34:19 crc kubenswrapper[4626]: E0223 09:34:19.875684 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559\": container with ID starting with d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559 not found: ID does not exist" containerID="d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.875801 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559"} err="failed to get container status \"d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559\": rpc error: code = NotFound desc = could not find container \"d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559\": container with ID starting with d06b49c48599a7df50adcfeab3222cd8ff2ab384c42c8fc029d7a58b470ad559 not found: ID does not exist"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.875898 4626 scope.go:117] "RemoveContainer" containerID="bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9"
Feb 23 09:34:19 crc kubenswrapper[4626]: E0223 09:34:19.882752 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9\": container with ID starting with bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9 not found: ID does not exist" containerID="bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.882882 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9"} err="failed to get container status \"bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9\": rpc error: code = NotFound desc = could not find container \"bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9\": container with ID starting with bba0f2deb904d362f079c4b1673ec6ab14499ebe5e025589623542a785dba3d9 not found: ID does not exist"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.883042 4626 scope.go:117] "RemoveContainer" containerID="0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720"
Feb 23 09:34:19 crc kubenswrapper[4626]: E0223 09:34:19.885747 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720\": container with ID starting with 0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720 not found: ID does not exist" containerID="0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.885792 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720"} err="failed to get container status \"0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720\": rpc error: code = NotFound desc = could not find container \"0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720\": container with ID starting with 0a7a3c3081633fa8f177972c99fa3fb77598a3ae329526bc652173858d89a720 not found: ID does not exist"
Feb 23 09:34:19 crc kubenswrapper[4626]: I0223 09:34:19.994852 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2342849-5802-4577-b127-727ec70c3362" path="/var/lib/kubelet/pods/c2342849-5802-4577-b127-727ec70c3362/volumes"
Feb 23 09:34:30 crc kubenswrapper[4626]: I0223 09:34:30.906727 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5zcfd"]
Feb 23 09:34:30 crc kubenswrapper[4626]: E0223 09:34:30.907963 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="extract-utilities"
Feb 23 09:34:30 crc kubenswrapper[4626]: I0223 09:34:30.907980 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="extract-utilities"
Feb 23 09:34:30 crc kubenswrapper[4626]: E0223 09:34:30.908007 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="extract-content"
Feb 23 09:34:30 crc kubenswrapper[4626]: I0223 09:34:30.908014 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="extract-content"
Feb 23 09:34:30 crc kubenswrapper[4626]: E0223 09:34:30.908030 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="registry-server"
Feb 23 09:34:30 crc kubenswrapper[4626]: I0223 09:34:30.908040 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="registry-server"
Feb 23 09:34:30 crc kubenswrapper[4626]: I0223 09:34:30.908322 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2342849-5802-4577-b127-727ec70c3362" containerName="registry-server"
Feb 23 09:34:30 crc kubenswrapper[4626]: I0223 09:34:30.910007 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:30 crc kubenswrapper[4626]: I0223 09:34:30.929814 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zcfd"]
Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.062852 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2r4z\" (UniqueName: \"kubernetes.io/projected/673611fb-301f-41d5-87ee-cbd1e883fdde-kube-api-access-n2r4z\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.062920 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-catalog-content\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.063548 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-utilities\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.165932 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2r4z\" (UniqueName: \"kubernetes.io/projected/673611fb-301f-41d5-87ee-cbd1e883fdde-kube-api-access-n2r4z\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.166015 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-catalog-content\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.166141 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-utilities\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.166782 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-utilities\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd"
Feb 23 09:34:31 crc kubenswrapper[4626]: 
I0223 09:34:31.166865 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-catalog-content\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.186162 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2r4z\" (UniqueName: \"kubernetes.io/projected/673611fb-301f-41d5-87ee-cbd1e883fdde-kube-api-access-n2r4z\") pod \"community-operators-5zcfd\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.230627 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.732810 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zcfd"] Feb 23 09:34:31 crc kubenswrapper[4626]: I0223 09:34:31.851427 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zcfd" event={"ID":"673611fb-301f-41d5-87ee-cbd1e883fdde","Type":"ContainerStarted","Data":"c101b0b885963b0cab2e1e36dcb9b946333e487f5efdea94250b50beb65dc9d9"} Feb 23 09:34:32 crc kubenswrapper[4626]: I0223 09:34:32.864641 4626 generic.go:334] "Generic (PLEG): container finished" podID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerID="1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19" exitCode=0 Feb 23 09:34:32 crc kubenswrapper[4626]: I0223 09:34:32.864724 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zcfd" 
event={"ID":"673611fb-301f-41d5-87ee-cbd1e883fdde","Type":"ContainerDied","Data":"1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19"} Feb 23 09:34:33 crc kubenswrapper[4626]: I0223 09:34:33.901658 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zcfd" event={"ID":"673611fb-301f-41d5-87ee-cbd1e883fdde","Type":"ContainerStarted","Data":"7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4"} Feb 23 09:34:35 crc kubenswrapper[4626]: I0223 09:34:35.921765 4626 generic.go:334] "Generic (PLEG): container finished" podID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerID="7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4" exitCode=0 Feb 23 09:34:35 crc kubenswrapper[4626]: I0223 09:34:35.921821 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zcfd" event={"ID":"673611fb-301f-41d5-87ee-cbd1e883fdde","Type":"ContainerDied","Data":"7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4"} Feb 23 09:34:36 crc kubenswrapper[4626]: I0223 09:34:36.931626 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zcfd" event={"ID":"673611fb-301f-41d5-87ee-cbd1e883fdde","Type":"ContainerStarted","Data":"2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc"} Feb 23 09:34:36 crc kubenswrapper[4626]: I0223 09:34:36.958306 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5zcfd" podStartSLOduration=3.437766302 podStartE2EDuration="6.958290666s" podCreationTimestamp="2026-02-23 09:34:30 +0000 UTC" firstStartedPulling="2026-02-23 09:34:32.867573855 +0000 UTC m=+10425.206903121" lastFinishedPulling="2026-02-23 09:34:36.388098219 +0000 UTC m=+10428.727427485" observedRunningTime="2026-02-23 09:34:36.951212601 +0000 UTC m=+10429.290541867" watchObservedRunningTime="2026-02-23 09:34:36.958290666 +0000 UTC 
m=+10429.297619932" Feb 23 09:34:37 crc kubenswrapper[4626]: I0223 09:34:37.763857 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/util/0.log" Feb 23 09:34:38 crc kubenswrapper[4626]: I0223 09:34:38.169921 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/util/0.log" Feb 23 09:34:38 crc kubenswrapper[4626]: I0223 09:34:38.205741 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/pull/0.log" Feb 23 09:34:38 crc kubenswrapper[4626]: I0223 09:34:38.242051 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/pull/0.log" Feb 23 09:34:38 crc kubenswrapper[4626]: I0223 09:34:38.406043 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/util/0.log" Feb 23 09:34:38 crc kubenswrapper[4626]: I0223 09:34:38.432880 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/extract/0.log" Feb 23 09:34:38 crc kubenswrapper[4626]: I0223 09:34:38.438185 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79677gfqq_57abc151-5a63-4be9-a7e5-eca0e831bdb9/pull/0.log" Feb 23 09:34:38 crc kubenswrapper[4626]: I0223 09:34:38.898335 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-ljvjt_d2c9303d-be86-451a-8834-67abc679952b/manager/0.log" Feb 23 09:34:39 crc kubenswrapper[4626]: I0223 09:34:39.321105 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-6h2w7_805fd015-c983-4199-b12e-c0073d645e3b/manager/0.log" Feb 23 09:34:39 crc kubenswrapper[4626]: I0223 09:34:39.685753 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-jlcjb_984c81c1-d4bf-417c-9937-d5de29d33a00/manager/0.log" Feb 23 09:34:40 crc kubenswrapper[4626]: I0223 09:34:40.007878 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-dzl5c_35a1fc39-9675-4591-95a7-aa1ff016b779/manager/0.log" Feb 23 09:34:40 crc kubenswrapper[4626]: I0223 09:34:40.586929 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-qqpl6_a05a1f00-3183-484d-9c35-7db986a84e8a/manager/0.log" Feb 23 09:34:40 crc kubenswrapper[4626]: I0223 09:34:40.770975 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-6qn7p_47ab4bc8-6cb2-4aeb-b0be-2bf4b069abb5/manager/0.log" Feb 23 09:34:41 crc kubenswrapper[4626]: I0223 09:34:41.216445 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-zsrhb_682961f6-ea8c-4883-93e3-af65115c9507/manager/0.log" Feb 23 09:34:41 crc kubenswrapper[4626]: I0223 09:34:41.236901 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:41 crc kubenswrapper[4626]: I0223 09:34:41.236944 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:41 crc kubenswrapper[4626]: I0223 09:34:41.578908 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-g6gzq_8c4d609a-2d99-44b9-9c86-e20a3965381b/manager/0.log" Feb 23 09:34:41 crc kubenswrapper[4626]: I0223 09:34:41.981660 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-j524z_cfc5b69c-d190-4f73-a311-c9a371762530/manager/0.log" Feb 23 09:34:42 crc kubenswrapper[4626]: I0223 09:34:42.288461 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5zcfd" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="registry-server" probeResult="failure" output=< Feb 23 09:34:42 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 09:34:42 crc kubenswrapper[4626]: > Feb 23 09:34:42 crc kubenswrapper[4626]: I0223 09:34:42.342811 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-mnbrh_98f392cd-76ee-4062-8ad1-15608b3658dc/manager/0.log" Feb 23 09:34:42 crc kubenswrapper[4626]: I0223 09:34:42.481490 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-r6gc2_4f909aac-ac8d-4ec5-8404-e9c1f77a144c/manager/0.log" Feb 23 09:34:42 crc kubenswrapper[4626]: I0223 09:34:42.693936 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-8nqw2_df31bf79-feee-4644-a1eb-bd6d5af05d7f/manager/0.log" Feb 23 09:34:43 crc kubenswrapper[4626]: I0223 09:34:43.113947 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-9wx75_d8a75041-f9ba-4691-9467-f20f9205daa6/manager/0.log" Feb 23 
09:34:43 crc kubenswrapper[4626]: I0223 09:34:43.669203 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-vqprw_8e6ad56b-628d-40a0-b847-0e0b0040ad46/operator/0.log" Feb 23 09:34:44 crc kubenswrapper[4626]: I0223 09:34:44.353181 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dqtlp_0a4018ed-6a39-4a3b-8d79-3ce2ef923ac7/registry-server/0.log" Feb 23 09:34:44 crc kubenswrapper[4626]: I0223 09:34:44.469430 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-vxbs9_8f5c3859-f8dd-42e6-97ee-84fe1bc5d02a/manager/0.log" Feb 23 09:34:44 crc kubenswrapper[4626]: I0223 09:34:44.821593 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-db9df_1867cac1-c043-4677-9c64-786b1f261fd5/manager/0.log" Feb 23 09:34:45 crc kubenswrapper[4626]: I0223 09:34:45.115735 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vzwhb_02bcecce-5716-45c6-8d40-f1da91d26673/operator/0.log" Feb 23 09:34:45 crc kubenswrapper[4626]: I0223 09:34:45.386542 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-hv75x_7f639fb4-3160-4bd7-ab2a-86ea80cb51ed/manager/0.log" Feb 23 09:34:45 crc kubenswrapper[4626]: I0223 09:34:45.778321 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-vdx4s_6f3e393a-a659-44ae-bad8-6e4ff2d649ce/manager/0.log" Feb 23 09:34:45 crc kubenswrapper[4626]: I0223 09:34:45.941250 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-mf52c_fab598e9-890d-4f24-b26b-8f5b507a86c8/manager/0.log" Feb 23 
09:34:46 crc kubenswrapper[4626]: I0223 09:34:46.015532 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-tzh2l_d9ef444e-448b-4576-960b-5861b7c19720/manager/0.log" Feb 23 09:34:46 crc kubenswrapper[4626]: I0223 09:34:46.165181 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-qfd84_cbbbb529-024e-4f9e-ad1c-063c63f39324/manager/0.log" Feb 23 09:34:46 crc kubenswrapper[4626]: I0223 09:34:46.854358 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-j9cqz_7e5b7475-eb3d-4c76-955d-0d9948cf2fe7/manager/0.log" Feb 23 09:34:51 crc kubenswrapper[4626]: I0223 09:34:51.277019 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:51 crc kubenswrapper[4626]: I0223 09:34:51.321153 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:51 crc kubenswrapper[4626]: I0223 09:34:51.532749 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zcfd"] Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.107730 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5zcfd" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="registry-server" containerID="cri-o://2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc" gracePeriod=2 Feb 23 09:34:53 crc kubenswrapper[4626]: E0223 09:34:53.339733 4626 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod673611fb_301f_41d5_87ee_cbd1e883fdde.slice/crio-conmon-2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod673611fb_301f_41d5_87ee_cbd1e883fdde.slice/crio-2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc.scope\": RecentStats: unable to find data in memory cache]" Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.667318 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.718612 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-utilities\") pod \"673611fb-301f-41d5-87ee-cbd1e883fdde\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.718683 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-catalog-content\") pod \"673611fb-301f-41d5-87ee-cbd1e883fdde\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.718721 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2r4z\" (UniqueName: \"kubernetes.io/projected/673611fb-301f-41d5-87ee-cbd1e883fdde-kube-api-access-n2r4z\") pod \"673611fb-301f-41d5-87ee-cbd1e883fdde\" (UID: \"673611fb-301f-41d5-87ee-cbd1e883fdde\") " Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.719286 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-utilities" (OuterVolumeSpecName: "utilities") 
pod "673611fb-301f-41d5-87ee-cbd1e883fdde" (UID: "673611fb-301f-41d5-87ee-cbd1e883fdde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.719515 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.750301 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/673611fb-301f-41d5-87ee-cbd1e883fdde-kube-api-access-n2r4z" (OuterVolumeSpecName: "kube-api-access-n2r4z") pod "673611fb-301f-41d5-87ee-cbd1e883fdde" (UID: "673611fb-301f-41d5-87ee-cbd1e883fdde"). InnerVolumeSpecName "kube-api-access-n2r4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.779203 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "673611fb-301f-41d5-87ee-cbd1e883fdde" (UID: "673611fb-301f-41d5-87ee-cbd1e883fdde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.822619 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/673611fb-301f-41d5-87ee-cbd1e883fdde-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:53 crc kubenswrapper[4626]: I0223 09:34:53.822655 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2r4z\" (UniqueName: \"kubernetes.io/projected/673611fb-301f-41d5-87ee-cbd1e883fdde-kube-api-access-n2r4z\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.118545 4626 generic.go:334] "Generic (PLEG): container finished" podID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerID="2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc" exitCode=0 Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.118614 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zcfd" event={"ID":"673611fb-301f-41d5-87ee-cbd1e883fdde","Type":"ContainerDied","Data":"2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc"} Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.118653 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zcfd" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.118679 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zcfd" event={"ID":"673611fb-301f-41d5-87ee-cbd1e883fdde","Type":"ContainerDied","Data":"c101b0b885963b0cab2e1e36dcb9b946333e487f5efdea94250b50beb65dc9d9"} Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.118703 4626 scope.go:117] "RemoveContainer" containerID="2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.144126 4626 scope.go:117] "RemoveContainer" containerID="7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.167036 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zcfd"] Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.174703 4626 scope.go:117] "RemoveContainer" containerID="1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.176400 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5zcfd"] Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.204798 4626 scope.go:117] "RemoveContainer" containerID="2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc" Feb 23 09:34:54 crc kubenswrapper[4626]: E0223 09:34:54.205900 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc\": container with ID starting with 2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc not found: ID does not exist" containerID="2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.205950 4626 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc"} err="failed to get container status \"2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc\": rpc error: code = NotFound desc = could not find container \"2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc\": container with ID starting with 2e54d19647e367aaa6345bf0535a83d264b679fdad99cd9616fbeba0d4b67ffc not found: ID does not exist" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.205982 4626 scope.go:117] "RemoveContainer" containerID="7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4" Feb 23 09:34:54 crc kubenswrapper[4626]: E0223 09:34:54.206246 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4\": container with ID starting with 7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4 not found: ID does not exist" containerID="7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.206270 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4"} err="failed to get container status \"7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4\": rpc error: code = NotFound desc = could not find container \"7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4\": container with ID starting with 7d7610d34eed528d2d7b4226d197688ea453c1f5f8daf7961abe46301262dba4 not found: ID does not exist" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.206286 4626 scope.go:117] "RemoveContainer" containerID="1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19" Feb 23 09:34:54 crc kubenswrapper[4626]: E0223 
09:34:54.207233 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19\": container with ID starting with 1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19 not found: ID does not exist" containerID="1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.207259 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19"} err="failed to get container status \"1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19\": rpc error: code = NotFound desc = could not find container \"1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19\": container with ID starting with 1b1374cd0b6891113cd018a034fe6cad03206d761537f16f268991873de03f19 not found: ID does not exist" Feb 23 09:34:54 crc kubenswrapper[4626]: I0223 09:34:54.796208 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-867lr_d20a75f5-68d0-4f32-bea9-62fdac3a3498/manager/0.log" Feb 23 09:34:55 crc kubenswrapper[4626]: I0223 09:34:55.685837 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:34:55 crc kubenswrapper[4626]: I0223 09:34:55.686462 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 23 09:34:55 crc kubenswrapper[4626]: I0223 09:34:55.991545 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" path="/var/lib/kubelet/pods/673611fb-301f-41d5-87ee-cbd1e883fdde/volumes" Feb 23 09:35:09 crc kubenswrapper[4626]: I0223 09:35:09.305879 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-q9kk6_db82ca2b-5eac-4858-8808-7b6e22af0e26/control-plane-machine-set-operator/0.log" Feb 23 09:35:09 crc kubenswrapper[4626]: I0223 09:35:09.494872 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qnh2c_77d47a11-2a3a-4803-8f4b-3bfe07c27e00/kube-rbac-proxy/0.log" Feb 23 09:35:09 crc kubenswrapper[4626]: I0223 09:35:09.576972 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qnh2c_77d47a11-2a3a-4803-8f4b-3bfe07c27e00/machine-api-operator/0.log" Feb 23 09:35:22 crc kubenswrapper[4626]: I0223 09:35:22.228012 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-mlgd2_ab37c078-9524-4dd2-b8f3-450a17f5255d/cert-manager-controller/0.log" Feb 23 09:35:22 crc kubenswrapper[4626]: I0223 09:35:22.468399 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-r5bt4_c7d90dff-0264-49d6-9d9e-ed5063ee6976/cert-manager-cainjector/0.log" Feb 23 09:35:22 crc kubenswrapper[4626]: I0223 09:35:22.526533 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-66srh_4f172a38-4eaf-468a-99d8-99416128eef9/cert-manager-webhook/0.log" Feb 23 09:35:25 crc kubenswrapper[4626]: I0223 09:35:25.685714 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:35:25 crc kubenswrapper[4626]: I0223 09:35:25.686348 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:35:34 crc kubenswrapper[4626]: I0223 09:35:34.952055 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-s4n9r_ba7e56b0-e6c6-434c-9f46-c4526c1448f7/nmstate-console-plugin/0.log" Feb 23 09:35:35 crc kubenswrapper[4626]: I0223 09:35:35.155909 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-5dgzp_89312997-cb41-4e39-9fb3-bb07a7b5d7b6/kube-rbac-proxy/0.log" Feb 23 09:35:35 crc kubenswrapper[4626]: I0223 09:35:35.167215 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6fccn_0b5fd810-1004-455d-ac3c-b7d5fc387861/nmstate-handler/0.log" Feb 23 09:35:35 crc kubenswrapper[4626]: I0223 09:35:35.248645 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-5dgzp_89312997-cb41-4e39-9fb3-bb07a7b5d7b6/nmstate-metrics/0.log" Feb 23 09:35:35 crc kubenswrapper[4626]: I0223 09:35:35.349054 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-7j8mq_33276922-ab9c-4bb0-ad0a-71ca54766841/nmstate-operator/0.log" Feb 23 09:35:35 crc kubenswrapper[4626]: I0223 09:35:35.513422 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-pvm5b_fbb43c1e-8e84-47ab-8bca-d2b1fc06efce/nmstate-webhook/0.log" Feb 23 09:35:55 crc 
kubenswrapper[4626]: I0223 09:35:55.685429 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:35:55 crc kubenswrapper[4626]: I0223 09:35:55.686016 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:35:55 crc kubenswrapper[4626]: I0223 09:35:55.686074 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" Feb 23 09:35:55 crc kubenswrapper[4626]: I0223 09:35:55.687056 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9b2b1dc10e03fbf1190e5ec1f19a1cd0f71f9331cc439eaba9e224b7d34e112"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:35:55 crc kubenswrapper[4626]: I0223 09:35:55.687108 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://b9b2b1dc10e03fbf1190e5ec1f19a1cd0f71f9331cc439eaba9e224b7d34e112" gracePeriod=600 Feb 23 09:35:56 crc kubenswrapper[4626]: I0223 09:35:56.704365 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" 
containerID="b9b2b1dc10e03fbf1190e5ec1f19a1cd0f71f9331cc439eaba9e224b7d34e112" exitCode=0 Feb 23 09:35:56 crc kubenswrapper[4626]: I0223 09:35:56.704434 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"b9b2b1dc10e03fbf1190e5ec1f19a1cd0f71f9331cc439eaba9e224b7d34e112"} Feb 23 09:35:56 crc kubenswrapper[4626]: I0223 09:35:56.705268 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerStarted","Data":"ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384"} Feb 23 09:35:56 crc kubenswrapper[4626]: I0223 09:35:56.705307 4626 scope.go:117] "RemoveContainer" containerID="d9d49a4537147dff92d4e5483ab1ac814de7598a0b503bfc1261dfa4c9a9b3e0" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.049060 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-94xt8_52957c0d-2568-459b-a83c-635bbd08c164/controller/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.088180 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-94xt8_52957c0d-2568-459b-a83c-635bbd08c164/kube-rbac-proxy/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.302173 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.496565 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.524857 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.559447 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.562882 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.752825 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.769068 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.808292 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:36:04 crc kubenswrapper[4626]: I0223 09:36:04.860323 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.128317 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-frr-files/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.133549 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-reloader/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.139233 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/cp-metrics/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.178082 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/controller/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.346003 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/frr-metrics/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.391518 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/kube-rbac-proxy-frr/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.396415 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/kube-rbac-proxy/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.672222 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/reloader/0.log" Feb 23 09:36:05 crc kubenswrapper[4626]: I0223 09:36:05.724760 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-tvj6h_06339724-62db-4cc8-930b-5ce2572b46da/frr-k8s-webhook-server/0.log" Feb 23 09:36:06 crc kubenswrapper[4626]: I0223 09:36:06.058565 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7df8f6cc8-jqkkq_5605b9df-eb52-4cb0-8f48-a273404aaf5d/manager/0.log" Feb 23 09:36:06 crc kubenswrapper[4626]: I0223 09:36:06.381114 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cnjgx_a86df0ed-d52d-4095-aed6-b298542a1c2e/kube-rbac-proxy/0.log" Feb 23 09:36:06 crc kubenswrapper[4626]: I0223 09:36:06.448921 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b49bc4cb8-dxzbr_e6756a3b-e295-453e-a3a6-1bc81275c97b/webhook-server/0.log" Feb 23 09:36:07 crc kubenswrapper[4626]: I0223 09:36:07.207218 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cnjgx_a86df0ed-d52d-4095-aed6-b298542a1c2e/speaker/0.log" Feb 23 09:36:07 crc kubenswrapper[4626]: I0223 09:36:07.379448 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jxvxj_d6cc0120-fd47-4d5b-86e7-35e1f3e46bbd/frr/0.log" Feb 23 09:36:21 crc kubenswrapper[4626]: I0223 09:36:21.589090 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/util/0.log" Feb 23 09:36:21 crc kubenswrapper[4626]: I0223 09:36:21.878104 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/util/0.log" Feb 23 09:36:21 crc kubenswrapper[4626]: I0223 09:36:21.881157 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/pull/0.log" Feb 23 09:36:21 crc kubenswrapper[4626]: I0223 09:36:21.963277 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/pull/0.log" Feb 23 09:36:22 crc kubenswrapper[4626]: I0223 09:36:22.087518 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/util/0.log" Feb 23 09:36:22 crc kubenswrapper[4626]: I0223 09:36:22.183969 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/pull/0.log" Feb 23 09:36:22 crc kubenswrapper[4626]: I0223 09:36:22.196749 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2139ckcv_24e5b70d-095c-49f0-93d4-89ba91e35a57/extract/0.log" Feb 23 09:36:22 crc kubenswrapper[4626]: I0223 09:36:22.341014 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-utilities/0.log" Feb 23 09:36:22 crc kubenswrapper[4626]: I0223 09:36:22.646238 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-content/0.log" Feb 23 09:36:22 crc kubenswrapper[4626]: I0223 09:36:22.693934 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-content/0.log" Feb 23 09:36:22 crc kubenswrapper[4626]: I0223 09:36:22.737794 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-utilities/0.log" Feb 23 09:36:23 crc kubenswrapper[4626]: I0223 09:36:23.003868 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-utilities/0.log" Feb 23 09:36:23 crc kubenswrapper[4626]: I0223 09:36:23.047193 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/extract-content/0.log" Feb 23 09:36:23 crc kubenswrapper[4626]: I0223 09:36:23.429691 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-utilities/0.log" Feb 23 09:36:23 crc kubenswrapper[4626]: I0223 09:36:23.671390 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-content/0.log" Feb 23 09:36:23 crc kubenswrapper[4626]: I0223 09:36:23.797742 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-utilities/0.log" Feb 23 09:36:23 crc kubenswrapper[4626]: I0223 09:36:23.850802 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-content/0.log" Feb 23 09:36:24 crc kubenswrapper[4626]: I0223 09:36:24.029630 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-ptj6b_b9fa1da3-e2df-48d0-99ca-ea2c952f8c49/registry-server/0.log" Feb 23 09:36:24 crc kubenswrapper[4626]: I0223 09:36:24.099708 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-utilities/0.log" Feb 23 09:36:24 crc kubenswrapper[4626]: I0223 09:36:24.197176 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/extract-content/0.log" Feb 23 09:36:24 crc kubenswrapper[4626]: I0223 09:36:24.352618 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/util/0.log" Feb 23 09:36:24 crc kubenswrapper[4626]: I0223 09:36:24.804238 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/util/0.log" Feb 23 09:36:24 crc kubenswrapper[4626]: I0223 09:36:24.848543 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/pull/0.log" Feb 23 09:36:24 crc kubenswrapper[4626]: I0223 09:36:24.857779 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/pull/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.086336 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/util/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.124716 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/extract/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.214814 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawd9nh_d5f558ac-cd5c-4de8-853a-e859cdb09641/pull/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.403940 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-222dt_34eadfde-147c-470f-bf62-db7e15fbf337/marketplace-operator/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.442942 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fw4n6_330ef0cc-5cfd-445b-ab4a-76df383091f0/registry-server/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: 
I0223 09:36:25.595029 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-utilities/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.831195 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-utilities/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.840370 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-content/0.log" Feb 23 09:36:25 crc kubenswrapper[4626]: I0223 09:36:25.886781 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-content/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.037340 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-utilities/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.048775 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/extract-content/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.380110 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvgms_cbe4020a-d6b4-48ac-93bd-9afc54de6668/registry-server/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.506843 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-utilities/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.678184 4626 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-utilities/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.700051 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-content/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.713999 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-content/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.875575 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-content/0.log" Feb 23 09:36:26 crc kubenswrapper[4626]: I0223 09:36:26.906533 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/extract-utilities/0.log" Feb 23 09:36:28 crc kubenswrapper[4626]: I0223 09:36:28.059795 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sll72_e6570124-6817-4052-89d0-179b1556ea3e/registry-server/0.log" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.208524 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xwsd7"] Feb 23 09:36:48 crc kubenswrapper[4626]: E0223 09:36:48.217848 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="extract-content" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.217893 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="extract-content" Feb 23 09:36:48 crc kubenswrapper[4626]: E0223 09:36:48.217910 4626 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="registry-server" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.217916 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="registry-server" Feb 23 09:36:48 crc kubenswrapper[4626]: E0223 09:36:48.217939 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="extract-utilities" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.217945 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="extract-utilities" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.218699 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="673611fb-301f-41d5-87ee-cbd1e883fdde" containerName="registry-server" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.224033 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.242692 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwsd7"] Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.253565 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqws4\" (UniqueName: \"kubernetes.io/projected/2cfaf627-b1e9-40c8-914b-ca56197c9a63-kube-api-access-rqws4\") pod \"redhat-marketplace-xwsd7\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.253626 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-catalog-content\") pod \"redhat-marketplace-xwsd7\" (UID: 
\"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.253685 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-utilities\") pod \"redhat-marketplace-xwsd7\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.356022 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqws4\" (UniqueName: \"kubernetes.io/projected/2cfaf627-b1e9-40c8-914b-ca56197c9a63-kube-api-access-rqws4\") pod \"redhat-marketplace-xwsd7\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.356077 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-catalog-content\") pod \"redhat-marketplace-xwsd7\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.356155 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-utilities\") pod \"redhat-marketplace-xwsd7\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.356642 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-catalog-content\") pod \"redhat-marketplace-xwsd7\" (UID: 
\"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.356711 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-utilities\") pod \"redhat-marketplace-xwsd7\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.383951 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqws4\" (UniqueName: \"kubernetes.io/projected/2cfaf627-b1e9-40c8-914b-ca56197c9a63-kube-api-access-rqws4\") pod \"redhat-marketplace-xwsd7\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") " pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:48 crc kubenswrapper[4626]: I0223 09:36:48.547047 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:49 crc kubenswrapper[4626]: I0223 09:36:49.572667 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwsd7"] Feb 23 09:36:50 crc kubenswrapper[4626]: I0223 09:36:50.291145 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwsd7" event={"ID":"2cfaf627-b1e9-40c8-914b-ca56197c9a63","Type":"ContainerDied","Data":"547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58"} Feb 23 09:36:50 crc kubenswrapper[4626]: I0223 09:36:50.292295 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerID="547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58" exitCode=0 Feb 23 09:36:50 crc kubenswrapper[4626]: I0223 09:36:50.292358 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwsd7" 
event={"ID":"2cfaf627-b1e9-40c8-914b-ca56197c9a63","Type":"ContainerStarted","Data":"8e531f521a5d3328d308f4f81c8bf6baa4771ad32739088cc33639b4085a01a4"} Feb 23 09:36:52 crc kubenswrapper[4626]: I0223 09:36:52.355013 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerID="93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262" exitCode=0 Feb 23 09:36:52 crc kubenswrapper[4626]: I0223 09:36:52.355830 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwsd7" event={"ID":"2cfaf627-b1e9-40c8-914b-ca56197c9a63","Type":"ContainerDied","Data":"93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262"} Feb 23 09:36:53 crc kubenswrapper[4626]: I0223 09:36:53.369127 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwsd7" event={"ID":"2cfaf627-b1e9-40c8-914b-ca56197c9a63","Type":"ContainerStarted","Data":"f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a"} Feb 23 09:36:53 crc kubenswrapper[4626]: I0223 09:36:53.397725 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xwsd7" podStartSLOduration=2.833421823 podStartE2EDuration="5.395082827s" podCreationTimestamp="2026-02-23 09:36:48 +0000 UTC" firstStartedPulling="2026-02-23 09:36:50.291065287 +0000 UTC m=+10562.630394543" lastFinishedPulling="2026-02-23 09:36:52.85272628 +0000 UTC m=+10565.192055547" observedRunningTime="2026-02-23 09:36:53.394179965 +0000 UTC m=+10565.733509231" watchObservedRunningTime="2026-02-23 09:36:53.395082827 +0000 UTC m=+10565.734412094" Feb 23 09:36:58 crc kubenswrapper[4626]: I0223 09:36:58.549932 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:58 crc kubenswrapper[4626]: I0223 09:36:58.551590 4626 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:36:59 crc kubenswrapper[4626]: I0223 09:36:59.600318 4626 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xwsd7" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="registry-server" probeResult="failure" output=< Feb 23 09:36:59 crc kubenswrapper[4626]: timeout: failed to connect service ":50051" within 1s Feb 23 09:36:59 crc kubenswrapper[4626]: > Feb 23 09:37:08 crc kubenswrapper[4626]: I0223 09:37:08.593713 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:37:08 crc kubenswrapper[4626]: I0223 09:37:08.642056 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xwsd7" Feb 23 09:37:08 crc kubenswrapper[4626]: I0223 09:37:08.851738 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwsd7"] Feb 23 09:37:10 crc kubenswrapper[4626]: I0223 09:37:10.554945 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xwsd7" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="registry-server" containerID="cri-o://f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a" gracePeriod=2 Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.012963 4626 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwsd7"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.098530 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqws4\" (UniqueName: \"kubernetes.io/projected/2cfaf627-b1e9-40c8-914b-ca56197c9a63-kube-api-access-rqws4\") pod \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") "
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.098621 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-utilities\") pod \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") "
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.098878 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-catalog-content\") pod \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\" (UID: \"2cfaf627-b1e9-40c8-914b-ca56197c9a63\") "
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.100226 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-utilities" (OuterVolumeSpecName: "utilities") pod "2cfaf627-b1e9-40c8-914b-ca56197c9a63" (UID: "2cfaf627-b1e9-40c8-914b-ca56197c9a63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.118106 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cfaf627-b1e9-40c8-914b-ca56197c9a63-kube-api-access-rqws4" (OuterVolumeSpecName: "kube-api-access-rqws4") pod "2cfaf627-b1e9-40c8-914b-ca56197c9a63" (UID: "2cfaf627-b1e9-40c8-914b-ca56197c9a63"). InnerVolumeSpecName "kube-api-access-rqws4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.120025 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cfaf627-b1e9-40c8-914b-ca56197c9a63" (UID: "2cfaf627-b1e9-40c8-914b-ca56197c9a63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.202424 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.202469 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqws4\" (UniqueName: \"kubernetes.io/projected/2cfaf627-b1e9-40c8-914b-ca56197c9a63-kube-api-access-rqws4\") on node \"crc\" DevicePath \"\""
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.202483 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cfaf627-b1e9-40c8-914b-ca56197c9a63-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.568297 4626 generic.go:334] "Generic (PLEG): container finished" podID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerID="f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a" exitCode=0
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.568437 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwsd7"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.568422 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwsd7" event={"ID":"2cfaf627-b1e9-40c8-914b-ca56197c9a63","Type":"ContainerDied","Data":"f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a"}
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.569536 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwsd7" event={"ID":"2cfaf627-b1e9-40c8-914b-ca56197c9a63","Type":"ContainerDied","Data":"8e531f521a5d3328d308f4f81c8bf6baa4771ad32739088cc33639b4085a01a4"}
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.571802 4626 scope.go:117] "RemoveContainer" containerID="f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.617233 4626 scope.go:117] "RemoveContainer" containerID="93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.620468 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwsd7"]
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.635415 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwsd7"]
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.643075 4626 scope.go:117] "RemoveContainer" containerID="547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.680655 4626 scope.go:117] "RemoveContainer" containerID="f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a"
Feb 23 09:37:11 crc kubenswrapper[4626]: E0223 09:37:11.681582 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a\": container with ID starting with f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a not found: ID does not exist" containerID="f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.682151 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a"} err="failed to get container status \"f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a\": rpc error: code = NotFound desc = could not find container \"f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a\": container with ID starting with f56aa9fe3a87ad7d0837a420dbdd07e71b57f895608e5df4e51c6922f433e47a not found: ID does not exist"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.682193 4626 scope.go:117] "RemoveContainer" containerID="93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262"
Feb 23 09:37:11 crc kubenswrapper[4626]: E0223 09:37:11.682595 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262\": container with ID starting with 93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262 not found: ID does not exist" containerID="93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.682635 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262"} err="failed to get container status \"93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262\": rpc error: code = NotFound desc = could not find container \"93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262\": container with ID starting with 93b12e7692a8dbcb2d49ebc2b2b02267ab6ce06a061711959fa8979996230262 not found: ID does not exist"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.682664 4626 scope.go:117] "RemoveContainer" containerID="547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58"
Feb 23 09:37:11 crc kubenswrapper[4626]: E0223 09:37:11.683054 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58\": container with ID starting with 547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58 not found: ID does not exist" containerID="547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.683089 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58"} err="failed to get container status \"547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58\": rpc error: code = NotFound desc = could not find container \"547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58\": container with ID starting with 547a7331bb1411158305df30447a8bd6d57572cd61d1bb318040d5f078f22b58 not found: ID does not exist"
Feb 23 09:37:11 crc kubenswrapper[4626]: I0223 09:37:11.993643 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" path="/var/lib/kubelet/pods/2cfaf627-b1e9-40c8-914b-ca56197c9a63/volumes"
Feb 23 09:38:25 crc kubenswrapper[4626]: I0223 09:38:25.685824 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:38:25 crc kubenswrapper[4626]: I0223 09:38:25.686845 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:38:49 crc kubenswrapper[4626]: I0223 09:38:49.519690 4626 generic.go:334] "Generic (PLEG): container finished" podID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerID="b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908" exitCode=0
Feb 23 09:38:49 crc kubenswrapper[4626]: I0223 09:38:49.519772 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f4swc/must-gather-jj8rt" event={"ID":"428281f6-5309-4718-aeec-82bdc3d2bf08","Type":"ContainerDied","Data":"b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908"}
Feb 23 09:38:49 crc kubenswrapper[4626]: I0223 09:38:49.521400 4626 scope.go:117] "RemoveContainer" containerID="b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908"
Feb 23 09:38:50 crc kubenswrapper[4626]: I0223 09:38:50.033376 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f4swc_must-gather-jj8rt_428281f6-5309-4718-aeec-82bdc3d2bf08/gather/0.log"
Feb 23 09:38:55 crc kubenswrapper[4626]: I0223 09:38:55.685050 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:38:55 crc kubenswrapper[4626]: I0223 09:38:55.686216 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:39:03 crc kubenswrapper[4626]: I0223 09:39:03.929923 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f4swc/must-gather-jj8rt"]
Feb 23 09:39:03 crc kubenswrapper[4626]: I0223 09:39:03.931066 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-f4swc/must-gather-jj8rt" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerName="copy" containerID="cri-o://c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645" gracePeriod=2
Feb 23 09:39:03 crc kubenswrapper[4626]: I0223 09:39:03.941939 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f4swc/must-gather-jj8rt"]
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.429243 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f4swc_must-gather-jj8rt_428281f6-5309-4718-aeec-82bdc3d2bf08/copy/0.log"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.432107 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.525893 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mt77\" (UniqueName: \"kubernetes.io/projected/428281f6-5309-4718-aeec-82bdc3d2bf08-kube-api-access-4mt77\") pod \"428281f6-5309-4718-aeec-82bdc3d2bf08\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") "
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.525958 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/428281f6-5309-4718-aeec-82bdc3d2bf08-must-gather-output\") pod \"428281f6-5309-4718-aeec-82bdc3d2bf08\" (UID: \"428281f6-5309-4718-aeec-82bdc3d2bf08\") "
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.547271 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/428281f6-5309-4718-aeec-82bdc3d2bf08-kube-api-access-4mt77" (OuterVolumeSpecName: "kube-api-access-4mt77") pod "428281f6-5309-4718-aeec-82bdc3d2bf08" (UID: "428281f6-5309-4718-aeec-82bdc3d2bf08"). InnerVolumeSpecName "kube-api-access-4mt77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.630091 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mt77\" (UniqueName: \"kubernetes.io/projected/428281f6-5309-4718-aeec-82bdc3d2bf08-kube-api-access-4mt77\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.674249 4626 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f4swc_must-gather-jj8rt_428281f6-5309-4718-aeec-82bdc3d2bf08/copy/0.log"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.674897 4626 generic.go:334] "Generic (PLEG): container finished" podID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerID="c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645" exitCode=143
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.674981 4626 scope.go:117] "RemoveContainer" containerID="c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.675007 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f4swc/must-gather-jj8rt"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.689218 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/428281f6-5309-4718-aeec-82bdc3d2bf08-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "428281f6-5309-4718-aeec-82bdc3d2bf08" (UID: "428281f6-5309-4718-aeec-82bdc3d2bf08"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.700449 4626 scope.go:117] "RemoveContainer" containerID="b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.732513 4626 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/428281f6-5309-4718-aeec-82bdc3d2bf08-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.743841 4626 scope.go:117] "RemoveContainer" containerID="c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645"
Feb 23 09:39:04 crc kubenswrapper[4626]: E0223 09:39:04.746303 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645\": container with ID starting with c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645 not found: ID does not exist" containerID="c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.746364 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645"} err="failed to get container status \"c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645\": rpc error: code = NotFound desc = could not find container \"c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645\": container with ID starting with c5d09e9f27d50dd8e56e6c5618ebe27c8dc67b39abf72ac2a0e52f4290840645 not found: ID does not exist"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.746412 4626 scope.go:117] "RemoveContainer" containerID="b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908"
Feb 23 09:39:04 crc kubenswrapper[4626]: E0223 09:39:04.747849 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908\": container with ID starting with b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908 not found: ID does not exist" containerID="b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908"
Feb 23 09:39:04 crc kubenswrapper[4626]: I0223 09:39:04.747887 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908"} err="failed to get container status \"b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908\": rpc error: code = NotFound desc = could not find container \"b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908\": container with ID starting with b141ce2a76da11dbc151bd2521397cf8a32ab32c7f5169ec1e705b70228d1908 not found: ID does not exist"
Feb 23 09:39:05 crc kubenswrapper[4626]: I0223 09:39:05.994055 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" path="/var/lib/kubelet/pods/428281f6-5309-4718-aeec-82bdc3d2bf08/volumes"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.672144 4626 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r6mlj"]
Feb 23 09:39:19 crc kubenswrapper[4626]: E0223 09:39:19.674061 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerName="gather"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674094 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerName="gather"
Feb 23 09:39:19 crc kubenswrapper[4626]: E0223 09:39:19.674116 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="registry-server"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674123 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="registry-server"
Feb 23 09:39:19 crc kubenswrapper[4626]: E0223 09:39:19.674151 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="extract-utilities"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674158 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="extract-utilities"
Feb 23 09:39:19 crc kubenswrapper[4626]: E0223 09:39:19.674168 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="extract-content"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674178 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="extract-content"
Feb 23 09:39:19 crc kubenswrapper[4626]: E0223 09:39:19.674193 4626 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerName="copy"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674198 4626 state_mem.go:107] "Deleted CPUSet assignment" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerName="copy"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674395 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerName="gather"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674424 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="428281f6-5309-4718-aeec-82bdc3d2bf08" containerName="copy"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.674434 4626 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cfaf627-b1e9-40c8-914b-ca56197c9a63" containerName="registry-server"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.675760 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.692534 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6mlj"]
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.693846 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p49s\" (UniqueName: \"kubernetes.io/projected/7dd1dc8e-2179-4241-83d0-de1730ceda1f-kube-api-access-4p49s\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.693904 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-utilities\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.694109 4626 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-catalog-content\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.797977 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p49s\" (UniqueName: \"kubernetes.io/projected/7dd1dc8e-2179-4241-83d0-de1730ceda1f-kube-api-access-4p49s\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.798151 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-utilities\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.798407 4626 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-catalog-content\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.798606 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-utilities\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.798858 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-catalog-content\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.819336 4626 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p49s\" (UniqueName: \"kubernetes.io/projected/7dd1dc8e-2179-4241-83d0-de1730ceda1f-kube-api-access-4p49s\") pod \"certified-operators-r6mlj\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") " pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:19 crc kubenswrapper[4626]: I0223 09:39:19.993418 4626 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:20 crc kubenswrapper[4626]: I0223 09:39:20.486764 4626 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6mlj"]
Feb 23 09:39:20 crc kubenswrapper[4626]: I0223 09:39:20.847016 4626 generic.go:334] "Generic (PLEG): container finished" podID="7dd1dc8e-2179-4241-83d0-de1730ceda1f" containerID="494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a" exitCode=0
Feb 23 09:39:20 crc kubenswrapper[4626]: I0223 09:39:20.847139 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6mlj" event={"ID":"7dd1dc8e-2179-4241-83d0-de1730ceda1f","Type":"ContainerDied","Data":"494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a"}
Feb 23 09:39:20 crc kubenswrapper[4626]: I0223 09:39:20.848425 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6mlj" event={"ID":"7dd1dc8e-2179-4241-83d0-de1730ceda1f","Type":"ContainerStarted","Data":"0940d74edf440cfbea844dc2527df56fccb6c6d9609d6a3f479fca890fcbb4bd"}
Feb 23 09:39:20 crc kubenswrapper[4626]: I0223 09:39:20.850059 4626 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 09:39:21 crc kubenswrapper[4626]: I0223 09:39:21.864184 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6mlj" event={"ID":"7dd1dc8e-2179-4241-83d0-de1730ceda1f","Type":"ContainerStarted","Data":"bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f"}
Feb 23 09:39:23 crc kubenswrapper[4626]: I0223 09:39:23.884981 4626 generic.go:334] "Generic (PLEG): container finished" podID="7dd1dc8e-2179-4241-83d0-de1730ceda1f" containerID="bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f" exitCode=0
Feb 23 09:39:23 crc kubenswrapper[4626]: I0223 09:39:23.885076 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6mlj" event={"ID":"7dd1dc8e-2179-4241-83d0-de1730ceda1f","Type":"ContainerDied","Data":"bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f"}
Feb 23 09:39:24 crc kubenswrapper[4626]: I0223 09:39:24.898593 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6mlj" event={"ID":"7dd1dc8e-2179-4241-83d0-de1730ceda1f","Type":"ContainerStarted","Data":"fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913"}
Feb 23 09:39:24 crc kubenswrapper[4626]: I0223 09:39:24.922357 4626 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r6mlj" podStartSLOduration=2.434802336 podStartE2EDuration="5.92233376s" podCreationTimestamp="2026-02-23 09:39:19 +0000 UTC" firstStartedPulling="2026-02-23 09:39:20.849686375 +0000 UTC m=+10713.189015641" lastFinishedPulling="2026-02-23 09:39:24.337217799 +0000 UTC m=+10716.676547065" observedRunningTime="2026-02-23 09:39:24.917996032 +0000 UTC m=+10717.257325298" watchObservedRunningTime="2026-02-23 09:39:24.92233376 +0000 UTC m=+10717.261663016"
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.685333 4626 patch_prober.go:28] interesting pod/machine-config-daemon-2jvsw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.685409 4626 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.685468 4626 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw"
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.686690 4626 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384"} pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.686765 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" containerName="machine-config-daemon" containerID="cri-o://ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" gracePeriod=600
Feb 23 09:39:25 crc kubenswrapper[4626]: E0223 09:39:25.810619 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.914396 4626 generic.go:334] "Generic (PLEG): container finished" podID="1b11f67b-b1fe-456a-843e-471433062d6c" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" exitCode=0
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.914461 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" event={"ID":"1b11f67b-b1fe-456a-843e-471433062d6c","Type":"ContainerDied","Data":"ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384"}
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.914538 4626 scope.go:117] "RemoveContainer" containerID="b9b2b1dc10e03fbf1190e5ec1f19a1cd0f71f9331cc439eaba9e224b7d34e112"
Feb 23 09:39:25 crc kubenswrapper[4626]: I0223 09:39:25.915851 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384"
Feb 23 09:39:25 crc kubenswrapper[4626]: E0223 09:39:25.916482 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:39:26 crc kubenswrapper[4626]: I0223 09:39:26.092207 4626 scope.go:117] "RemoveContainer" containerID="a41d3286d41d2b71fab167aa7b644e39c28af259e30b0c3db405ac0f40424f39"
Feb 23 09:39:29 crc kubenswrapper[4626]: I0223 09:39:29.993980 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:29 crc kubenswrapper[4626]: I0223 09:39:29.994454 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:30 crc kubenswrapper[4626]: I0223 09:39:30.036566 4626 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:31 crc kubenswrapper[4626]: I0223 09:39:31.009879 4626 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:31 crc kubenswrapper[4626]: I0223 09:39:31.064112 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6mlj"]
Feb 23 09:39:32 crc kubenswrapper[4626]: I0223 09:39:32.990095 4626 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r6mlj" podUID="7dd1dc8e-2179-4241-83d0-de1730ceda1f" containerName="registry-server" containerID="cri-o://fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913" gracePeriod=2
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.472623 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.624804 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-utilities\") pod \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") "
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.625011 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p49s\" (UniqueName: \"kubernetes.io/projected/7dd1dc8e-2179-4241-83d0-de1730ceda1f-kube-api-access-4p49s\") pod \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") "
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.625063 4626 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-catalog-content\") pod \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\" (UID: \"7dd1dc8e-2179-4241-83d0-de1730ceda1f\") "
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.625910 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-utilities" (OuterVolumeSpecName: "utilities") pod "7dd1dc8e-2179-4241-83d0-de1730ceda1f" (UID: "7dd1dc8e-2179-4241-83d0-de1730ceda1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.633141 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd1dc8e-2179-4241-83d0-de1730ceda1f-kube-api-access-4p49s" (OuterVolumeSpecName: "kube-api-access-4p49s") pod "7dd1dc8e-2179-4241-83d0-de1730ceda1f" (UID: "7dd1dc8e-2179-4241-83d0-de1730ceda1f"). InnerVolumeSpecName "kube-api-access-4p49s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.671374 4626 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dd1dc8e-2179-4241-83d0-de1730ceda1f" (UID: "7dd1dc8e-2179-4241-83d0-de1730ceda1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.728129 4626 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.728165 4626 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd1dc8e-2179-4241-83d0-de1730ceda1f-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:33 crc kubenswrapper[4626]: I0223 09:39:33.728177 4626 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p49s\" (UniqueName: \"kubernetes.io/projected/7dd1dc8e-2179-4241-83d0-de1730ceda1f-kube-api-access-4p49s\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.000541 4626 generic.go:334] "Generic (PLEG): container finished" podID="7dd1dc8e-2179-4241-83d0-de1730ceda1f" containerID="fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913" exitCode=0
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.000610 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6mlj" event={"ID":"7dd1dc8e-2179-4241-83d0-de1730ceda1f","Type":"ContainerDied","Data":"fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913"}
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.000654 4626 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6mlj" event={"ID":"7dd1dc8e-2179-4241-83d0-de1730ceda1f","Type":"ContainerDied","Data":"0940d74edf440cfbea844dc2527df56fccb6c6d9609d6a3f479fca890fcbb4bd"}
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.000664 4626 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6mlj"
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.000678 4626 scope.go:117] "RemoveContainer" containerID="fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913"
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.028079 4626 scope.go:117] "RemoveContainer" containerID="bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f"
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.066705 4626 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6mlj"]
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.079646 4626 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r6mlj"]
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.081543 4626 scope.go:117] "RemoveContainer" containerID="494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a"
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.097795 4626 scope.go:117] "RemoveContainer" containerID="fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913"
Feb 23 09:39:34 crc kubenswrapper[4626]: E0223 09:39:34.098322 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913\": container with ID starting with fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913 not found: ID does not exist" containerID="fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913"
Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.098434 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913"} err="failed to get container status \"fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913\": rpc error: code = NotFound desc = could not find
container \"fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913\": container with ID starting with fe98ef3fdc1f50dd9d2c4d778623af6bbe8b6bdedeb15ad147f7242703cb5913 not found: ID does not exist" Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.098568 4626 scope.go:117] "RemoveContainer" containerID="bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f" Feb 23 09:39:34 crc kubenswrapper[4626]: E0223 09:39:34.099834 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f\": container with ID starting with bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f not found: ID does not exist" containerID="bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f" Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.099874 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f"} err="failed to get container status \"bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f\": rpc error: code = NotFound desc = could not find container \"bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f\": container with ID starting with bfbb3e84bc073ec476bf17e4cd9ac0c4b1b8ab802e6014bddfa3acfdfb82199f not found: ID does not exist" Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.099904 4626 scope.go:117] "RemoveContainer" containerID="494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a" Feb 23 09:39:34 crc kubenswrapper[4626]: E0223 09:39:34.100287 4626 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a\": container with ID starting with 494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a not found: ID does 
not exist" containerID="494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a" Feb 23 09:39:34 crc kubenswrapper[4626]: I0223 09:39:34.100321 4626 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a"} err="failed to get container status \"494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a\": rpc error: code = NotFound desc = could not find container \"494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a\": container with ID starting with 494a0720d095adcb0afdad512fb08f2cebbe4336fc4a4392393203e3a959882a not found: ID does not exist" Feb 23 09:39:35 crc kubenswrapper[4626]: I0223 09:39:35.992371 4626 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd1dc8e-2179-4241-83d0-de1730ceda1f" path="/var/lib/kubelet/pods/7dd1dc8e-2179-4241-83d0-de1730ceda1f/volumes" Feb 23 09:39:37 crc kubenswrapper[4626]: I0223 09:39:37.987749 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:39:37 crc kubenswrapper[4626]: E0223 09:39:37.988483 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:39:52 crc kubenswrapper[4626]: I0223 09:39:52.982394 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:39:52 crc kubenswrapper[4626]: E0223 09:39:52.983473 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:40:05 crc kubenswrapper[4626]: I0223 09:40:05.983792 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:40:05 crc kubenswrapper[4626]: E0223 09:40:05.985651 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:40:16 crc kubenswrapper[4626]: I0223 09:40:16.982979 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:40:16 crc kubenswrapper[4626]: E0223 09:40:16.984228 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:40:28 crc kubenswrapper[4626]: I0223 09:40:28.982160 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:40:28 crc kubenswrapper[4626]: E0223 09:40:28.983949 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:40:43 crc kubenswrapper[4626]: I0223 09:40:43.982040 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:40:43 crc kubenswrapper[4626]: E0223 09:40:43.982982 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:40:58 crc kubenswrapper[4626]: I0223 09:40:58.982156 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:40:58 crc kubenswrapper[4626]: E0223 09:40:58.983025 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:41:10 crc kubenswrapper[4626]: I0223 09:41:10.984219 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:41:10 crc kubenswrapper[4626]: E0223 09:41:10.985963 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:41:21 crc kubenswrapper[4626]: I0223 09:41:21.986296 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:41:21 crc kubenswrapper[4626]: E0223 09:41:21.988055 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:41:34 crc kubenswrapper[4626]: I0223 09:41:34.981839 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:41:34 crc kubenswrapper[4626]: E0223 09:41:34.982896 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:41:48 crc kubenswrapper[4626]: I0223 09:41:48.983321 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:41:48 crc kubenswrapper[4626]: E0223 09:41:48.984252 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:42:03 crc kubenswrapper[4626]: I0223 09:42:03.982527 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:42:03 crc kubenswrapper[4626]: E0223 09:42:03.983559 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:42:18 crc kubenswrapper[4626]: I0223 09:42:18.982471 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:42:18 crc kubenswrapper[4626]: E0223 09:42:18.983054 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c" Feb 23 09:42:30 crc kubenswrapper[4626]: I0223 09:42:30.981846 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384" Feb 23 09:42:30 crc kubenswrapper[4626]: E0223 09:42:30.982832 4626 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"
Feb 23 09:42:41 crc kubenswrapper[4626]: I0223 09:42:41.982903 4626 scope.go:117] "RemoveContainer" containerID="ba4992e1e4ee30125f6d1125aa397d0ed5225576763c01d36c6da490c2334384"
Feb 23 09:42:41 crc kubenswrapper[4626]: E0223 09:42:41.983832 4626 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2jvsw_openshift-machine-config-operator(1b11f67b-b1fe-456a-843e-471433062d6c)\"" pod="openshift-machine-config-operator/machine-config-daemon-2jvsw" podUID="1b11f67b-b1fe-456a-843e-471433062d6c"